Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/03/03 15:48:51 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #28

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/28/display/redirect?page=changes>

Changes:

[github] Revert "[BEAM-8335] Add PCollection to DataFrame logic for


------------------------------------------
[...truncated 5.76 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0303151257-706659", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input4edfbfe5-33a7-43f5-b506-4a6f1d5cef21"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input4edfbfe5-33a7-43f5-b506-4a6f1d5cef21", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output4edfbfe5-33a7-43f5-b506-4a6f1d5cef21", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
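
The job graph above (steps s1-s3) describes a three-stage streaming pipeline: a Pub/Sub read, a metrics-generating ParDo, and a Pub/Sub write. A minimal sketch of a pipeline with this shape, assuming standard Beam Python APIs (the DoFn import path, step names, and streaming job type are taken from the graph; the elided subscription/topic strings and option wiring are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)

    options = PipelineOptions(streaming=True)  # graph type is JOB_TYPE_STREAMING
    with beam.Pipeline(options=options) as p:
        (p
         # s1: ParallelRead, format "pubsub", with_attributes=False
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             subscription='projects/apache-beam-testing/subscriptions/...')
         # s2: ParallelDo running the metrics DoFn
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         # s3: ParallelWrite back to Pub/Sub
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
             topic='projects/apache-beam-testing/topics/...'))
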
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-03T15:13:17.126512Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_07_13_15-7923392266024362041'
 location: u'us-central1'
 name: u'beamapp-jenkins-0303151257-706659'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-03T15:13:17.126512Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_07_13_15-7923392266024362041]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_15-7923392266024362041?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_07_13_15-7923392266024362041 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:15.731Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_07_13_15-7923392266024362041. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:15.731Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:15.731Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_07_13_15-7923392266024362041.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:19.428Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:20.608Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.248Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.281Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.345Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.394Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.425Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.463Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.496Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.586Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.621Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.677Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.751Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.814Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.854Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:21.884Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T15:13:48.286Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
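
The warning above points at the Monitoring v3 API for cleaning up stale metric descriptors. A rough sketch of that cleanup with the google-cloud-monitoring client (assuming the 2.x client surface; the custom-metric filter prefix is an assumption):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    # List custom metric descriptors and delete the unused ones; deletion is
    # permanent, so in practice each descriptor should be inspected first.
    request = {'name': project_name,
               'filter': 'metric.type = starts_with("custom.googleapis.com/")'}
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(request={'name': descriptor.name})
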
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_07_13_15-7923392266024362041 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2158.344s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_14-4837719249639431834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_22_09-7309337841328457629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_32_14-632556571384094611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_40_25-2523922033880141507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_15-7923392266024362041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_22_00-16708983592784101013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_30_54-9958082887629704381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_13-6340106320468709413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_21_56-12954451352193776281?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_31_57-3649246692278896890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_13-3478042489059731189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_21_12-8041614680019463682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_29_13-8874854015137162611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_14-868558776649973643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_22_27-481486678734698421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_30_43-9676410141236738832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_16-16390949604032669487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_22_35-853614293906898831?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_30_55-18145497025711319327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_16-9997919535181269169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_21_48-17347257113415581407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_30_59-7376578243382849745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_13_13-9150156513785991936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_07_22_23-702752593736041990?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
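
Per the "Try:" hints above, the two failing suites can be rerun with diagnostics from the repository root (assuming the standard Gradle wrapper), e.g. ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests --stacktrace --info; the task paths are the ones named in the two failure reports.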

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 20s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/qp3burbodllt6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #131

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/131/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
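
The assert_that/Unkey and assert_that/Match steps in the graph above come from Beam's test assertion utility, with Match wrapping the _equal matcher shown in its display_data. A minimal sketch of the kind of test pipeline that expands into these steps (the element values are invented):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
        pcoll = p | 'Create' >> beam.Create([1, 2, 3])
        # Expands into the assert_that/{ToVoidKey,Group,Unkey,Match} steps
        # visible in the job graph; equal_to produces the _equal DoFn.
        assert_that(pcoll, equal_to([1, 2, 3]))
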
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-16T13:14:58.481147Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-16_06_14_57-1092353974894782244'
 location: u'us-central1'
 name: u'beamapp-jenkins-0316131440-196324'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-16T13:14:58.481147Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-16_06_14_57-1092353974894782244]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_14_57-1092353974894782244?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_06_14_57-1092353974894782244 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:14:57.446Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-16_06_14_57-1092353974894782244. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:14:57.446Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:14:57.446Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-16_06_14_57-1092353974894782244.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:01.089Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:01.748Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.300Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.327Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.371Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.401Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.427Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.453Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.485Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.556Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.637Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.691Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.792Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.817Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.838Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.870Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.901Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.948Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:02.986Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.095Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.124Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.175Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.200Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.227Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.249Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.273Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.295Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.324Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.346Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.374Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.396Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.434Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.460Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.489Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:03.513Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:05.796Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:05.832Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:05.874Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:32.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:15:39.978Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:16:05.554Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:16:05.583Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:04.760Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:09.216Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:09.312Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:09.352Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:09.401Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:21:09.449Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:22:43.325Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:22:43.372Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T13:22:43.407Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_06_14_57-1092353974894782244 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
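
test_key_param is the source of the "Key param" step seen in the fusing messages above; it exercises injection of DoFn parameters. A rough paraphrase of what such a test does (the real body lives in apache_beam/pipeline_test.py; this sketch assumes the public beam.DoFn.KeyParam API):

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
        keys = (p
                | 'Create' >> beam.Create([('a', 1), ('b', 2)])
                # DoFn parameters can be requested via default arguments;
                # KeyParam injects the key of each keyed element.
                | 'Key param' >> beam.Map(lambda _, key=beam.DoFn.KeyParam: key))
        assert_that(keys, equal_to(['a', 'b']))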

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2140.586s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_36-11544927878181811117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_56_31-16286757301105039092?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_06_16-15867715685456866971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_14_57-1092353974894782244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_35-4457263578046743609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_56_31-16117432307606884317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_05_43-2201020776580281421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_36-3821088756380695479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_56_01-407873900547070395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_04_11-2483193087219417292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_37-11655097077961724644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_55_23-6175406880977706160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_04_38-4102926025088366347?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_33-1780797054111563271?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_57_19-16057601647672209898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_36-3425752233599708904?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_56_24-2130587330893451124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_05_14-3698624903442800802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_37-1965094887369419947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_55_40-3218719463955045057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_04_55-15663049651630045277?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_47_34-190880563338861085?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_05_56_34-926479440407385516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_06_05_21-1562060856759835856?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 46s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/tkzni6xqdxsba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #130

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/130/display/redirect?page=changes>

Changes:

[github] [BEAM-9346] Improve the efficiency of TFRecordIO (#11122)


------------------------------------------
[...truncated 5.43 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-16T10:20:19.876416Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-16_03_20_18-6093234224826958335'
 location: u'us-central1'
 name: u'beamapp-jenkins-0316102002-673612'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-16T10:20:19.876416Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-16_03_20_18-6093234224826958335]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_20_18-6093234224826958335?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_03_20_18-6093234224826958335 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:18.946Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:18.946Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-16_03_20_18-6093234224826958335. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:18.947Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-16_03_20_18-6093234224826958335.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:21.840Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:22.560Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.065Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.107Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.180Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.204Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.240Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.282Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.323Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.420Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.533Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.584Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.623Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.664Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.699Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.741Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.775Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.818Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.857Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.892Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.933Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:23.970Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.082Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.121Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.157Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.184Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.210Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.235Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.265Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.303Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.336Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.376Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.409Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.440Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.476Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:24.514Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:26.803Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:26.841Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:26.877Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:38.952Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:20:53.221Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:21:19.459Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:21:19.492Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:25.710Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:27.714Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:27.761Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:27.796Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:27.834Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:26:27.865Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:28:01.873Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:28:01.930Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T10:28:01.960Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_03_20_18-6093234224826958335 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2113.348s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_24-10132605718287545035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_02_25-9327766205917590517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_11_09-3818616090565458397?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_20_18-6093234224826958335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_23-9325517365728780625?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_03_22-10665222991260528135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_24-4610905872544226067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_00_39-12771707782512927868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_09_32-4400372937473190232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_24-17581121878681775559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_01_53-18446683607428796130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_10_02-1234247459074411878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_22-17620841033265929491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_01_32-17211571049938045862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_10_31-3190273627253669505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_25-9596878510163805071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_02_44-5715639818882038786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_10_44-8208455928293893822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_25-1151007856523606024?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_01_47-17537584249533429935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_10_48-210885158689219450?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_02_53_22-14468209155166863447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_01_33-17730108385761750184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_03_10_23-15551619023095765362?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 51s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/sgbzoyihzkxfa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #129

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/129/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
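
The assert_that/Unkey and assert_that/Match ParallelDo steps in the job JSON above are what Beam's testing matcher expands into. A hedged sketch of the kind of streaming test pipeline that yields such a graph (the pipeline contents are illustrative; only assert_that, equal_to and the streaming job type are taken from the log):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.testing.util import assert_that, equal_to

    # streaming=True corresponds to the "JOB_TYPE_STREAMING" field above.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        pcoll = p | beam.Create(['a', 'b', 'c'])
        # assert_that expands into the Group/GroupByKey, Unkey and Match
        # steps seen in the graph; Match runs the _equal matcher named in
        # the display_data.
        assert_that(pcoll, equal_to(['a', 'b', 'c']))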
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-16T07:16:16.054635Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-16_00_16_15-7553723596099324717'
 location: u'us-central1'
 name: u'beamapp-jenkins-0316071559-561612'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-16T07:16:16.054635Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-16_00_16_15-7553723596099324717]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_16_15-7553723596099324717?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_00_16_15-7553723596099324717 is in state JOB_STATE_RUNNING
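
The "Waiting indefinitely for streaming job" warning above comes from the test driver blocking on the submitted job. A rough sketch of that driver-side pattern (runner, options and the pipeline body are omitted or illustrative; this is not the apiclient code itself):

    import apache_beam as beam

    p = beam.Pipeline()  # options/runner omitted in this sketch
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x)

    result = p.run()
    result.wait_until_finish()  # blocks until JOB_STATE_DONE or a failure
    print(result.state)         # mirrors the JOB_STATE_* messages logged here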
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:15.054Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-16_00_16_15-7553723596099324717. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:15.054Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:15.054Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-16_00_16_15-7553723596099324717.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:18.065Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:18.692Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.214Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.253Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.310Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.345Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.367Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.410Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.444Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.550Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.642Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.712Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.741Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.779Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.806Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.843Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.934Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:19.964Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.034Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.068Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.107Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.142Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.173Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.195Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.229Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.250Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.283Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.319Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.353Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.390Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.426Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.461Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.499Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.534Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.560Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.592Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:20.630Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:22.847Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:22.887Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:22.927Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:48.839Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:16:52.625Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:17:40.590Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:17:40.626Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:22:21.805Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:23:26.555Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:23:26.654Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:23:26.688Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:23:26.727Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:23:26.760Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:24:22.327Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:24:22.380Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T07:24:22.414Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-16_00_16_15-7553723596099324717 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2146.824s

OK
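
The Create/MaybeReshuffle/Reshuffle/* steps fused in the log above come from Beam's Reshuffle transform, which expands into AddRandomKeys, ReshufflePerKey (a GroupByKey with timestamp reify/restore around it) and RemoveRandomKeys. A minimal usage sketch (the pipeline contents are illustrative):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | beam.Create(range(10))
            | beam.Reshuffle()  # AddRandomKeys -> ReshufflePerKey -> RemoveRandomKeys
            | beam.Map(print)
        )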
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_13-3705757632296883988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_58_07-17435178414558312412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_06_55-1901203315844850110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_16_15-7553723596099324717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_11-4427242818099326140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_58_01-12619938645858688344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_11-15462437689635308613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_57_18-17692726874263475025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_06_33-8508138181659676017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_14-10323106824694424330?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_56_56-17351890642803838329?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_05_45-2524454867986642631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_13-5604475922305425307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_57_04-10080523031019284276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_05_59-17336921332112758348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_09-1046018042343001177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_57_58-5609727553592121567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_06_43-6587500102839376028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_15-14538798587114379911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_58_04-10399824298727220618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_06_02-8273968859981909665?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_49_11-8084858625703308339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_23_57_11-6641645420014486101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-16_00_05_36-18180197336757424147?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 52s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/2cnvivfyj3nmg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #128

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/128/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-16T01:11:28.535950Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-15_18_11_26-18399912318736135086'
 location: u'us-central1'
 name: u'beamapp-jenkins-0316011111-620232'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-16T01:11:28.535950Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-15_18_11_26-18399912318736135086]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_11_26-18399912318736135086?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_18_11_26-18399912318736135086 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:27.007Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-15_18_11_26-18399912318736135086. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:27.007Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:27.008Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-15_18_11_26-18399912318736135086.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:31.014Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:31.822Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.296Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.325Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.397Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.451Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.494Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.563Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.623Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.736Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.843Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.909Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.946Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:32.976Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.013Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.045Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.082Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.124Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.163Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.240Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.282Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.316Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.356Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.392Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.471Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.562Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.605Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.643Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.678Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.718Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.749Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.781Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.836Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.870Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.906Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:33.941Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:48.900Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:48.931Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:48.968Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:11:58.586Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:12:12.898Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:12:53.613Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:12:53.650Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:17:35.277Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:18:49.930Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:18:49.987Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:18:50.012Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:18:50.054Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:18:50.086Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:20:01.276Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:20:01.327Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-16T01:20:01.364Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_18_11_26-18399912318736135086 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2158.559s

OK
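
The Stackdriver warning earlier in this log (the project already holding 100 Dataflow-created metric descriptors) is triggered by user-defined metrics: each unique metric name produces one descriptor, independent of the DoFn that defines it. A minimal sketch of such a user metric (namespace and counter name are illustrative):

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics

    class CountingFn(beam.DoFn):
        def __init__(self):
            # One metric descriptor per unique (namespace, name) pair.
            self.elements_seen = Metrics.counter('illustrative_ns', 'elements_seen')

        def process(self, element):
            self.elements_seen.inc()
            yield element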
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_30-2093933438884145594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_34-259996465684995901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_28-15119161665718982573?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_11_26-18399912318736135086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-7832728792107219510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_52_12-6611491636554023824?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_01_05-5307329467599918051?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-4343541483993246738?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_35-4550169054324441040?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_18-16732331318005291801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-12757860204138397480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_13-13535570291734587409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_07-17925790960060533678?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-592842600123389579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_27-17096388157673465172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_15-9286667420682877925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-744374357123635043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_28-2841278497313129690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_32-12144187694958711516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_26-9775449440201050809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_21-5758145780402249309?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_44_30-6939740345381775090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_17_53_26-15096847898328606686?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_18_02_05-15878199492635136266?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 8s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/2rhhtzf4d2wha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #127

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/127/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-15T19:12:27.367138Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-15_12_12_26-16520164591068965028'
 location: u'us-central1'
 name: u'beamapp-jenkins-0315191211-243022'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-15T19:12:27.367138Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-15_12_12_26-16520164591068965028]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_12_26-16520164591068965028?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_12_12_26-16520164591068965028 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:26.313Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-15_12_12_26-16520164591068965028.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:26.313Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-15_12_12_26-16520164591068965028. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:26.313Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:30.203Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.116Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.692Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.731Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.816Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.855Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.898Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.941Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:31.975Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.123Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.221Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.282Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.323Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.364Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.400Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.434Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.467Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.497Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.531Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.561Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.603Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.634Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.669Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.708Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.742Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.778Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.810Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.847Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.880Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.914Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.951Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:32.986Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.029Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.064Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
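For reference, the fused graph above corresponds to a small ValidatesRunner pipeline of roughly the following shape (a sketch reconstructed from the step names in the log, not the actual test source; the DoFn body is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.testing.util import assert_that, equal_to

    class KeyParamDoFn(beam.DoFn):
        # Stand-in for the 'Key param' step: reads the key of a KV element.
        def process(self, element, key=beam.DoFn.KeyParam):
            yield key

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        keys = (p
                | 'Create' >> beam.Create([('a', 1), ('b', 2)])  # Impulse/FlatMap/MaybeReshuffle above
                | 'Key param' >> beam.ParDo(KeyParamDoFn()))
        assert_that(keys, equal_to(['a', 'b']))  # expands into the assert_that/* steps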
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.111Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.153Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.187Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:33.222Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:35.577Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:35.620Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:12:35.699Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:13:03.056Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:13:09.642Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
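A sketch of pruning old descriptors with the Cloud Monitoring client library (google-cloud-monitoring); the filter prefix and keyword spelling vary by library version, so treat this as illustrative rather than exact:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # List custom metric descriptors in the project; Dataflow user metrics
    # are assumed here to live under the custom.googleapis.com/ prefix.
    for descriptor in client.list_metric_descriptors(
            name='projects/apache-beam-testing',
            filter_='metric.type = starts_with("custom.googleapis.com/")'):
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to prune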
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:13:48.761Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:13:48.798Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:18:34.708Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:19:38.697Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:19:38.766Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:19:39.074Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:19:39.170Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:19:39.197Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:21:08.244Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:21:08.280Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T19:21:08.308Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_12_12_26-16520164591068965028 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2167.910s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_32-16904411984762928448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_37-2279282604168334357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_03_42-16584852019295167366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_12_26-16520164591068965028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_34-3550332600174574201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_53_59-12803633130811275156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_01_59-10241744754816859321?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_34-1427320013406246439?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_53_48-9824399789810083424?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_03_03-7252718369185758421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_35-5568684803364571836?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_23-8268616269117078990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_32-18086981062403628080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_40-10985868674036958348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_02_55-17017371463741056289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_34-9589728485304766249?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_22-16647285750402762565?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_03_28-8444308171483179423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_34-11140924251868462241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_26-12677918841102501406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_02_31-387387035941053166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_45_32-6092526764971403575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_11_54_37-3279281931194052057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_12_02_36-3464114431758016485?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
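Concretely, the failing task can be re-run from the workspace with the suggested flags, e.g.:

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace --info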

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 9s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/luxyb5gmr4bzm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #126

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/126/display/redirect>

Changes:


------------------------------------------
[...truncated 5.42 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
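The assert_that/Match step above wraps the _equal callable shown in its display_data; its check is roughly the following (a simplified sketch of equal_to from apache_beam.testing.util, not the SDK's exact code):

    from apache_beam.testing.util import BeamAssertException

    def equal_to(expected):
        expected = list(expected)
        def _equal(actual):
            # Order-insensitive comparison of the materialized PCollection.
            if sorted(expected) != sorted(actual):
                raise BeamAssertException(
                    'Failed assert: %r == %r' % (expected, actual))
        return _equal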
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-15T13:12:24.916484Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-15_06_12_23-12908926038509056066'
 location: u'us-central1'
 name: u'beamapp-jenkins-0315131209-061692'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-15T13:12:24.916484Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-15_06_12_23-12908926038509056066]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_12_23-12908926038509056066?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_06_12_23-12908926038509056066 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:23.774Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:23.774Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-15_06_12_23-12908926038509056066. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:23.774Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-15_06_12_23-12908926038509056066.
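The warning refers to the Streaming Engine default range; a job can pin it via the SDK's worker options, e.g. (values illustrative):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--streaming',
        '--num_workers=1',
        '--max_num_workers=4',  # specifying maxNumWorkers avoids the 1-100 default-scaling warning
    ])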
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:26.883Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:27.482Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.072Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.111Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.189Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.227Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.251Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
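Combiner lifting only applies when a GroupByKey feeds directly into a combiner, which the assert_that and Reshuffle GroupByKeys above do not; a shape that would qualify (minimal sketch):

    import apache_beam as beam

    with beam.Pipeline() as p:
        totals = (p
                  | beam.Create([('a', 1), ('a', 2), ('b', 3)])
                  # CombinePerKey = GroupByKey followed by a combiner,
                  # so the service can lift it into a pre-shuffle partial combine.
                  | beam.CombinePerKey(sum))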
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.293Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.335Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.456Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.566Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.627Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.658Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.683Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.712Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.748Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.789Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.824Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.866Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.905Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.940Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:28.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.033Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.068Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.108Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.132Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.170Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.211Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.248Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.286Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.329Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.361Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.396Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.442Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.485Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.520Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:29.556Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:31.813Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:31.845Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:31.888Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:12:49.679Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:13:01.017Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:13:38.024Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:13:38.052Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:18:30.770Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:19:33.217Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:19:33.262Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:19:33.292Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:19:33.334Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:19:33.364Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:21:05.515Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:21:05.562Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T13:21:05.602Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_06_12_23-12908926038509056066 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2203.148s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_53-15227361981606892050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_53_52-14789849273332459769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_03_00-13403323942980288120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_12_23-12908926038509056066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_53-14734470830076048646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_53_56-580862287702906856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_02_52-12567719216705962755?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_55-2631276243084490964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_52_48-17817975577320429582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_02_22-15171610902490766510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_55-4343182623641439285?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_53_00-14736073052178780122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_00_44-7129244160426282920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_55-18264864582871266421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_52_35-1441788825822393151?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_00_34-13292608711354960168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_52-372600769791729179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_52_42-2806487068640718808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_00_47-7593897737233465486?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_55-8619349793991742669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_53_59-11218220749011995497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_44_55-11183257370146538774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_05_53_55-14881448214782407042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_06_02_56-3005200067222029015?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 18s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ohv4bhgbrimc4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #125

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/125/display/redirect>

Changes:


------------------------------------------
[...truncated 5.42 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-15T07:10:11.988657Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-15_00_10_11-13038260149090419689'
 location: u'us-central1'
 name: u'beamapp-jenkins-0315070956-405527'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-15T07:10:11.988657Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-15_00_10_11-13038260149090419689]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_10_11-13038260149090419689?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_00_10_11-13038260149090419689 is in state JOB_STATE_RUNNING
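Rather than waiting indefinitely as the test runner does here, a user pipeline can bound the wait (sketch; the pipeline is a hypothetical stand-in and duration is in milliseconds):

    import apache_beam as beam

    p = beam.Pipeline()  # hypothetical pipeline under the same options
    result = p.run()
    result.wait_until_finish(duration=60 * 60 * 1000)  # give up after an hour
    print(result.state)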
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:11.083Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-15_00_10_11-13038260149090419689. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:11.083Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:11.083Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-15_00_10_11-13038260149090419689.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:14.853Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:15.511Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.094Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.131Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.213Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.253Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.285Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.325Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.362Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.440Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.551Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.614Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.642Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.676Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.699Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.724Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.752Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.786Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.815Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.850Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.884Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.919Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.956Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:16.993Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.027Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.056Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.086Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.117Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.153Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.178Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.200Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.233Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.265Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.303Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.339Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.367Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.392Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:17.424Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:19.701Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:19.741Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:19.779Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:47.190Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:10:54.866Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:11:19.701Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:11:19.734Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:16:18.676Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:17:22.811Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:17:22.885Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:17:22.936Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:17:23.323Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:17:23.365Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:18:53.960Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:18:53.995Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T07:18:54.025Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-15_00_10_11-13038260149090419689 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2140.163s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_42-16589744759029921669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_52_54-7747694925503636789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_01_53-6728161310563688669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_10_11-13038260149090419689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_43-6602125548215667380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_08-13347878321040188385?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_00_06-2934676411408966215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_43-7207094483693261529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_52_56-598571176790019433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_42-4714864194237509366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_26-6796827268740907670?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_59_24-5708846189403128948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_40-14000624195545829082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_25-14536870846279479757?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_00_28-5485524331315211538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_43-5288487904899576794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_32-14613189189250397796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_00_52-8298369945054572417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_44-3719535633194926317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_32-12235938937593978976?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_59_29-17710416525974190635?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_43_41-14791170749632196631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_23_51_45-10320079281859188970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-15_00_01_04-8167579611571573072?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 16s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ddouezpccw4b2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #124

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/124/display/redirect>

Changes:


------------------------------------------
[...truncated 5.50 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
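
The assert_that/Unkey (s20) and assert_that/Match (s21) steps in the job graph
above come from the Beam testing utilities rather than user pipeline code. A
minimal sketch, assuming a local TestPipeline (the Create input below is
illustrative, not taken from this build); equal_to supplies the _equal matcher
that Match wraps in a CallableWrapperDoFn:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    # assert_that expands into the Group/Unkey/Match transforms that appear
    # as the final steps of the serialized job graph above.
    with TestPipeline() as p:
        pcoll = p | beam.Create([1, 2, 3])
        assert_that(pcoll, equal_to([1, 2, 3]))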
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-15T01:11:55.193737Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-14_18_11_54-6208634692893188896'
 location: u'us-central1'
 name: u'beamapp-jenkins-0315011138-404269'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-15T01:11:55.193737Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-14_18_11_54-6208634692893188896]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_11_54-6208634692893188896?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_18_11_54-6208634692893188896 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:54.086Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:54.086Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-14_18_11_54-6208634692893188896. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:54.086Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-14_18_11_54-6208634692893188896.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:57.357Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:58.229Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:58.862Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:58.905Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:58.968Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.009Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.034Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.071Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.104Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.215Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:11:59.333Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.008Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.122Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.163Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.200Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.234Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.271Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.305Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.348Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.379Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.407Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.449Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.525Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.563Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.587Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.620Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.661Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.691Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.733Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.765Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.792Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.823Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.858Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.907Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.937Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:00.973Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:01.008Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:03.251Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:03.305Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:03.356Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:29.561Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:30.604Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:55.503Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:12:55.540Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:18:02.240Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:19:05.750Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:19:05.803Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:19:05.831Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:19:05.863Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:19:05.894Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:20:21.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:20:21.744Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-15T01:20:21.780Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_18_11_54-6208634692893188896 is in state JOB_STATE_DONE
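
The Stackdriver metric-descriptor warning earlier in this run is driven by
user-defined metrics. A minimal sketch, assuming the standard Metrics API (the
counter name below is hypothetical): each unique metric name declared this way
maps to one metric descriptor, which is what the 100-descriptor limit counts.

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class CountingDoFn(beam.DoFn):
        def __init__(self):
            # One descriptor per unique metric name, independent of which
            # DoFn declares it.
            self.seen = Metrics.counter(self.__class__, 'elements_seen')

        def process(self, element):
            self.seen.inc()
            yield element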
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2186.641s

OK
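
test_element_param and test_key_param above exercise DoFn parameter defaults.
A reconstructed sketch of the test_key_param pattern (not copied from the Beam
test suite), which is also what produces the 'Key param' step in the fused
graph earlier in this run:

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    with TestPipeline() as p:
        keys = (p
                | beam.Create([('a', 1), ('b', 2)])
                # beam.DoFn.KeyParam binds the key of each keyed element.
                | 'Key param' >> beam.Map(
                    lambda _, key=beam.DoFn.KeyParam: key))
        assert_that(keys, equal_to(['a', 'b']))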
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_27-1928378256962925093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_53_18-838768243788808069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_03_09-16376488445518671393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_11_54-6208634692893188896?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_26-5401171313451517726?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_54_19-4525142505316523583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_03_07-2740717980774130012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_27-11743770705388718103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_52_10-1795557720674088687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_01_05-4031603282129151351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_29-9882019003011342584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_54_14-15852087226861642393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_27-8042406273856713605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_53_08-14118738790079329852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_02_12-8088024419840196794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_24-708444267543794620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_53_10-12254427421077346155?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_02_21-11152338911885627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_27-1567235235941053734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_53_21-6943594395760557282?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_02_06-8010600098202443972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_44_26-15299761029407040066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_17_53_21-16949416273076797464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_18_02_15-16528327152303013094?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
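
For local triage, the failing suite can usually be rerun on its own with the
task path Gradle names above, for example (flags are the ones Gradle suggests):

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace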

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 30s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/4bve276juf6pg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #123

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/123/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T19:15:48.989478Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-14_12_15_41-8912082830664365753'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314191526-628267'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T19:15:48.989478Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-14_12_15_41-8912082830664365753]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_15_41-8912082830664365753?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_12_15_41-8912082830664365753 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:41.667Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:41.667Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-14_12_15_41-8912082830664365753.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:41.667Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-14_12_15_41-8912082830664365753. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:51.655Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:52.454Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.019Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.059Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.120Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.159Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.190Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.222Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.259Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.366Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.504Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.554Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.587Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.617Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.642Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.677Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.704Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.739Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.771Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.806Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.832Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.892Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.925Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.952Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:53.987Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.021Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.057Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.093Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.129Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.166Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.192Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.228Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.254Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.297Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.328Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.360Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:54.390Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:57.425Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:57.455Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:15:57.492Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:16:05.965Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:16:22.136Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:16:54.247Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:16:54.283Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:21:56.400Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:22:58.594Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:22:58.655Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:22:58.694Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:22:58.728Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:22:58.778Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:24:00.004Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:24:00.057Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T19:24:00.115Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_12_15_41-8912082830664365753 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2182.300s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_20-13439477968348871763?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_58_04-488310178453718292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_53-3368042727168256681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_15_41-8912082830664365753?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_19-6346216542886907894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_57_06-11976514128501301449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_06-16890769272729768793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_20-4605797650604006665?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_56_29-9122400683291934800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_05_08-6794715842732950041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_20-16950751203004525567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_58_15-13324114184721871110?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_29-4179636208182511492?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_19-11870398686181883312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_57_11-617419122329790612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_25-4970313754762634295?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_20-4412567108341133052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_57_16-1110644197533388888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_21-7654506479302018381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_21-13676138368719669163?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_58_02-9418692508590686256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_48_18-9988757494347560639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_11_57_15-3845480180059645009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_12_06_10-13079564480191737654?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 13s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/7brsuf2yzfts4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #122

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/122/display/redirect?page=changes>

Changes:

[mxm] [BEAM-9490] Guard referencing for environment expiration via a lock

[thw] [BEAM-9490] Use the lock that belongs to the cache when bundle load


------------------------------------------
[...truncated 5.42 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T15:05:00.344662Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-14_08_04_59-11751460595727913791'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314150442-982990'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T15:05:00.344662Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-14_08_04_59-11751460595727913791]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_08_04_59-11751460595727913791?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_08_04_59-11751460595727913791 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:04:59.123Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-14_08_04_59-11751460595727913791. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:04:59.123Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-14_08_04_59-11751460595727913791.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:04:59.123Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:02.129Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:02.798Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.343Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.371Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.447Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.489Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.518Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.546Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.586Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.725Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.824Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.884Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.931Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.962Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:03.994Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.021Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.052Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.078Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.103Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.165Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.236Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.384Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.448Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.486Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.516Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.553Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.579Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.603Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.638Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.676Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.709Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.740Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.763Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:04.795Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:07.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:07.046Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:07.089Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:37.127Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:05:41.523Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:06:08.397Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:06:08.429Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:11:06.035Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:12:07.990Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:12:08.079Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:12:08.163Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:12:08.198Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:12:08.246Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:13:33.450Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:13:33.487Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T15:13:33.509Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_08_04_59-11751460595727913791 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2236.727s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_51-5332053272381257775?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_45_46-756065797054365395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_55_39-1515991819654578893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_08_04_59-11751460595727913791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_49-12168323101416408841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_45_53-1557861246590352586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_54_51-988800409676763305?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_51-8354899346968862184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_44_31-17742931188827852723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_53_28-3807873170831808555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_50-8008042696106858900?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_46_04-12088623691344608227?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_54_54-6419311800921882526?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_50-13436754959922286790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_44_58-16317430522141232135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_54_11-17184240593054745650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_48-1114475104704113798?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_46_46-8680298306620840106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_55_12-16323839808889936522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_50-17950768899856968083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_46_48-13735078092882650746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_36_49-14442707578770728290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_45_57-15324210902955707980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_07_54_51-4275725093208060173?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
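To reproduce locally with the suggested flags, the failing task can be re-run from the Beam repo root (a hedged example; the task path is taken from the error above):

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace --info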

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 54s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/i3ddgujmbrqvu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #121

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/121/display/redirect>

Changes:


------------------------------------------
[...truncated 5.49 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
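The assert_that/* steps in the JSON above (Unkey, and Match with the _equal matcher) are the expansion of apache_beam.testing.util.assert_that. A minimal sketch of the user-level code that produces this graph shape:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
      pcoll = p | beam.Create(['a', 'b'])
      # equal_to supplies the _equal callable shown in the Match
      # step's display data above.
      assert_that(pcoll, equal_to(['a', 'b']))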
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T13:13:04.632473Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-14_06_13_03-532541153450390161'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314131248-137196'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T13:13:04.632473Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-14_06_13_03-532541153450390161]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_13_03-532541153450390161?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_06_13_03-532541153450390161 is in state JOB_STATE_RUNNING
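These suites launch each test case as its own streaming Dataflow job. A sketch of the kind of pipeline options involved, with placeholder values wherever the real ones are not shown in this log:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='TestDataflowRunner',
        project='apache-beam-testing',
        region='us-central1',
        temp_location='gs://<your-bucket>/temp',  # placeholder
        streaming=True,
    )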
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:03.662Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:03.662Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-14_06_13_03-532541153450390161.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:03.662Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-14_06_13_03-532541153450390161. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:06.878Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:08.007Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.022Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.228Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.420Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.519Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.549Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.582Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.606Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.704Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.791Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.864Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.898Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.927Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.965Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:09.993Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.039Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.068Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.108Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.145Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.185Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.213Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.244Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.283Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.310Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.338Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.389Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.412Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.438Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.461Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.489Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.517Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.544Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.614Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.659Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.688Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:10.726Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:16.512Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:16.537Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:16.562Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:34.225Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:13:41.345Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:14:17.649Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:14:17.737Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:19:11.945Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:20:19.598Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:20:19.652Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:20:19.688Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:20:19.729Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:20:19.773Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:22:03.694Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:22:04.071Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T13:22:04.209Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_06_13_03-532541153450390161 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2272.464s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_44-5801443539111303740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_25-17859212446237224899?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_04_04-7944333752565558868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_13_03-532541153450390161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_45-10658414224524509775?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_10-3522101220009125058?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_02_20-15511471999704164005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_45-17696520452638962822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_53_03-7322861411058598069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_02_52-401061529752153115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_46-3071520402027077184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_25-8456671576135439806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_03_41-5892629086018419202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_46-9483633500076767762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_20-976899982390721935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_03_36-8934007552393246?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_43-16730853929563219715?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_20-10346722850006220293?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_02_21-93687088430689825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_45-17066877386807990514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_08-12347679143697441008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_06_02_17-2644925541961387184?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_44_46-11861938884874213203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_05_54_25-10691568985262590055?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 26s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ncqhklscz2vek

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #120

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/120/display/redirect>

Changes:


------------------------------------------
[...truncated 5.47 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T07:11:38.360544Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-14_00_11_36-12399839469489123658'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314071120-578007'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T07:11:38.360544Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-14_00_11_36-12399839469489123658]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_11_36-12399839469489123658?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_00_11_36-12399839469489123658 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:36.621Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-14_00_11_36-12399839469489123658.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:36.621Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-14_00_11_36-12399839469489123658. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:36.621Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:40.964Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:41.899Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.517Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.542Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.609Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.642Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.671Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
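Combiner lifting is skipped above because neither GroupByKey feeds a combiner; when a CombinePerKey follows, the runner can lift partial combining ahead of the shuffle. A minimal sketch of the case where lifting does apply:

    import apache_beam as beam

    with beam.Pipeline() as p:
      kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])
      # CombinePerKey(sum) lets the runner combine partial sums before
      # the shuffle; a bare GroupByKey (as in assert_that/Group) has
      # nothing to lift.
      totals = kvs | beam.CombinePerKey(sum)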
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.705Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.740Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.845Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:42.954Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.013Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.050Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.087Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.116Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.150Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.177Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.210Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.238Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.272Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.299Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.336Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.375Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.407Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.470Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.504Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.536Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.563Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.586Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.611Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.645Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.674Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.711Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.758Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.798Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.826Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:43.860Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:46.069Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:46.099Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:11:46.128Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:12:06.053Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:12:18.258Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:12:49.098Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:12:49.143Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:17:45.038Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:18:47.993Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:18:48.056Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:18:48.096Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:18:48.134Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:18:48.160Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:20:13.009Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:20:13.054Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T07:20:13.094Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-14_00_11_36-12399839469489123658 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2177.496s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_32-4253669136607864819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_53_21-17620710771076676771?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_02_26-2890932325302855995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_11_36-12399839469489123658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_31-5120272136184078544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_52_43-2294268124063712997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_00_34-14498460625890844360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_32-16348576698162986985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_52_34-12471311889394014430?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_00_27-3084721740365030761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_33-1790871954610187005?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_53_21-13919078720102882964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_30-16064096372028726146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_52_31-5479891854033276234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_00_51-15881469637553149421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_33-2025516658828583323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_52_29-11775941172147629042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_00_33-2612836477588839984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_33-15693219797274907170?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_53_26-9621333627459934566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_02_23-14048158553425374764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_44_32-5106174136398829595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_23_53_20-2488225180389107482?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-14_00_02_07-13071567797845861760?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 39s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/qghrturjbyjho

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #119

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/119/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-8335] Final PR to merge the InteractiveBeam feature branch


------------------------------------------
[...truncated 5.50 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T02:08:23.323681Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_19_08_22-5805701378486950026'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314020802-476121'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T02:08:23.323681Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_19_08_22-5805701378486950026]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_19_08_22-5805701378486950026?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_19_08_22-5805701378486950026 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:22.302Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_19_08_22-5805701378486950026. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:22.302Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:22.302Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_19_08_22-5805701378486950026.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:25.500Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:26.331Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:26.883Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:26.904Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:26.973Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.012Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.047Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.083Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.123Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.219Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.294Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.359Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.400Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.434Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.468Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.494Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.519Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.553Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.585Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.607Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.643Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.671Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.703Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.739Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.770Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.800Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.829Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.858Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.890Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.918Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.953Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:27.989Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.023Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.062Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
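
The run of Create/MaybeReshuffle/Reshuffle/* fusion messages above is the expanded form of Beam's Reshuffle transform: AddRandomKeys, reify/restore timestamps around a streaming GroupByKey, then RemoveRandomKeys. In this job the Reshuffle is applied internally by Create; a minimal sketch of invoking it explicitly:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([1, 2, 3])
             # Reshuffle expands into the AddRandomKeys / GroupByKey /
             # RemoveRandomKeys steps seen in the fusion log above.
             | beam.Reshuffle()
             | beam.Map(print))
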
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.098Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.203Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.242Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:28.273Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:32.684Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:32.724Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:32.762Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:57.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:08:57.284Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
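
The metric-descriptor warning above concerns user-defined metrics: each unique (namespace, name) pair becomes one Stackdriver descriptor, regardless of which DoFn reports it. A minimal sketch of defining such a metric (names hypothetical):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class TaggingDoFn(beam.DoFn):
        def __init__(self):
            # One metric descriptor per unique (namespace, name) pair.
            self.seen = Metrics.counter('example_ns', 'elements_seen')

        def process(self, element):
            self.seen.inc()
            yield element
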
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:09:41.372Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:09:41.405Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:14:31.691Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:15:34.441Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:15:34.495Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:15:34.531Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:15:34.571Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:15:34.610Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:16:27.675Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:16:27.728Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T02:16:27.780Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_19_08_22-5805701378486950026 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2098.135s

OK
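
The test_element_param and test_key_param cases above exercise DoFn parameter injection. A minimal sketch of the pattern they validate (data values hypothetical; the real tests live in apache_beam.pipeline_test):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    class KeyEchoDoFn(beam.DoFn):
        # beam.DoFn.KeyParam injects the key of a KV element; ElementParam
        # similarly injects the element itself.
        def process(self, element, key=beam.DoFn.KeyParam):
            yield key

    with beam.Pipeline() as p:
        keys = (p
                | beam.Create([('a', 1), ('b', 2)])
                | 'Key param' >> beam.ParDo(KeyEchoDoFn()))
        assert_that(keys, equal_to(['a', 'b']))
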
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_18-7995528119377746898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_56-18190215762594152269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_39-13151358149625945130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_19_08_22-5805701378486950026?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_14-3381059249023795211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_54-14648308655327032746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_03-15682762307791298289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_16-38184278861997193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_49_53-12850737495823526485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_58_22-15524294436181781779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_17-11663835880946469168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_15-2169793381835590008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_08-11560610095009248527?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_17-6590434341411820580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_51-6333026304854292918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_36-6144352080879110696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_13-5953232171150424698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_44-5364201423511963477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_20-4492640929391964558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_15-12190048685589893931?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_58-2627036782807572145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_42_14-6287134868560631209?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_50_36-6922544025200030311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_18_59_10-7674937407838944391?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
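
The failing task can normally be reproduced locally from the Beam repository root (a sketch; the Jenkins job may pass extra project and credential properties not shown in this log):

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace --info
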

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 51s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/pijm6pambgcny

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #118

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/118/display/redirect?page=changes>

Changes:

[github] Update default value in Java snippet

[github] [BEAM-9477] RowCoder should be hashable and picklable (#11088)


------------------------------------------
[...truncated 5.45 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
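
The assert_that/Unkey and assert_that/Match steps serialized above are generated by Beam's testing utilities rather than written by hand; Match wraps the _equal matcher shown in its display_data. A minimal sketch of the pipeline shape that produces these steps (data values hypothetical):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        pcoll = p | beam.Create(['x', 'y'])
        # Expands into the ToVoidKey / Group / Unkey / Match steps visible
        # in the job graph above.
        assert_that(pcoll, equal_to(['x', 'y']))
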
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-14T00:10:58.518372Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_17_10_57-1907708509268727757'
 location: u'us-central1'
 name: u'beamapp-jenkins-0314001042-435284'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-14T00:10:58.518372Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_17_10_57-1907708509268727757]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_10_57-1907708509268727757?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_17_10_57-1907708509268727757 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:10:57.533Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:10:57.533Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_17_10_57-1907708509268727757.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:10:57.533Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_17_10_57-1907708509268727757. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:01.080Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:03.745Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.367Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.394Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.468Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.506Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.536Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.581Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.611Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.704Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.813Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.880Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.909Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.932Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:04.972Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.004Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.041Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.077Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.113Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.177Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.211Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.249Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.322Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.356Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.390Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.415Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.452Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.486Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.525Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.568Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.607Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.641Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.691Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.727Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.762Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:05.794Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:08.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:08.065Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:08.108Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:12.759Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:11:36.331Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:12:06.941Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:12:07.036Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:17:06.972Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:18:09.780Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:18:09.989Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:18:10.040Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:18:10.129Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:18:10.178Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:19:09.666Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:19:09.709Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-14T00:19:09.749Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_17_10_57-1907708509268727757 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2115.255s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_24-4254005971960119662?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_23-12146893911034271483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_02_13-6415123872904633825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_10_57-1907708509268727757?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_24-13225210400874531307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_51-9105468466045695850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_22-17571904308734031442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_58-4639220338409650970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_29-17047960994774862032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_23-8572304618269673560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_53-15294737707994211243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_18-17164657353119948543?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_24-5095436060360372246?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_53-396541644786816257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_16-9749507001574399148?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_24-5660720032538169660?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_51-141817272792328131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_26-11020598178388314617?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_25-207531720310419241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_58-5640402590744124953?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_33-4067732953345865975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_44_23-7949791181850153101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_16_52_56-7352211286121600659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_17_01_38-18015928301315676317?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jvs64ns3bzgms

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #117

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/117/display/redirect?page=changes>

Changes:

[robertwb] Supporting infrastructure for dataframes on beam.

[robertwb] Basic deferred data frame implementation.

[robertwb] yapf, py2

[github] typings and docs for expressions.py

[robertwb] Minor cleanup, lint.

[github] Update Dataflow py container version (#11120)

[github] [BEAM-7923] Streaming support and pipeline pruning when instrumenting a


------------------------------------------
[...truncated 5.42 MB...]
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T22:17:25.218128Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_15_17_23-12529163397607741582'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313221708-954836'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T22:17:25.218128Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_15_17_23-12529163397607741582]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_17_23-12529163397607741582?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_15_08_43-15832829594642741164 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_15_17_23-12529163397607741582 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:23.217Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_15_17_23-12529163397607741582.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:23.217Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_15_17_23-12529163397607741582. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:23.217Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:27.343Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:29.385Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:29.888Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:29.940Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:29.995Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.022Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.059Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.099Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.130Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.213Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.301Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.359Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.391Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.419Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.456Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.492Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.527Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.559Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.595Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.620Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.652Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.689Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.712Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.747Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.780Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.814Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.847Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.880Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.911Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.944Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:30.978Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.006Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.084Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.120Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.165Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.192Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.225Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:31.257Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:34.018Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:34.053Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:34.091Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:17:58.509Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:18:05.413Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:18:37.800Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:18:37.842Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:23:32.441Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:24:35.505Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:24:35.582Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:24:35.628Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:24:35.673Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:24:35.711Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:26:01.737Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:26:01.809Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T22:26:01.845Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_15_17_23-12529163397607741582 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2153.865s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_38-16209817018660483915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_00_00-3919483331268212802?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_08_50-1938921522235089958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_17_23-12529163397607741582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_38-1853623614469731300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_58_35-8760195671834766188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_07_12-4123251280628741924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_39-233709489417938673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_59_09-7160011738978711267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_07_46-11136859753395707284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_39-3022002646294135072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_00_02-15176914292944112940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_08_18-1763864788517409835?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_36-12187188231329258749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_59_11-11102929904474081568?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_07_39-15042211843480580332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_38-8321634133169447663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_59_23-10458782692994383251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_40-5973683301928031201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_59_14-5452802930403715780?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_08_07-7503094722304683893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_50_37-2056613024272112842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_14_59_13-6723789087836411710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_15_08_43-15832829594642741164?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 53s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/vqr77ioz7co5w

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #116

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/116/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-9485] Raise error when transform urn is not implemented

[lcwik] [BEAM-9481] fix indentation

[ankurgoenka] [BEAM-9499] Sickbay test_multi_triggered_gbk_side_input for streaming


------------------------------------------
[...truncated 5.42 MB...]
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
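
The assert_that/Unkey and assert_that/Match steps in the job graph above come from Beam's testing utilities; a minimal pipeline that expands into this shape looks roughly like the following sketch (element values hypothetical):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        pcoll = p | beam.Create([1, 2, 3])
        # equal_to supplies the _equal matcher shown in the Match step's
        # display_data; assert_that expands into the Group/Unkey/Match
        # steps that appear in the graph above.
        assert_that(pcoll, equal_to([1, 2, 3]))
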
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T20:24:05.822252Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_13_24_04-4515641317461515192'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313202351-024301'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T20:24:05.822252Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_13_24_04-4515641317461515192]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_24_04-4515641317461515192?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:05.452Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:05.491Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:05.527Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_13_24_04-4515641317461515192 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_13_15_20-459286025012616842 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:04.787Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_13_24_04-4515641317461515192.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:04.787Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:04.787Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_13_24_04-4515641317461515192. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:07.824Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:08.675Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.137Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.159Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.217Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.246Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.275Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.309Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.334Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.402Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.484Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.526Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.556Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.587Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.613Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.633Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.660Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.684Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.712Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.740Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.789Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.815Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.840Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.863Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.893Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.927Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.949Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:09.974Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.001Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.024Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.051Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.076Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.098Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.131Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.157Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.178Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:10.199Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:12.469Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:12.499Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:12.525Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:36.013Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
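
For context, the user-defined metrics this warning counts are declared by name inside a DoFn; a minimal sketch (class and metric names hypothetical):

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics

    class CountingDoFn(beam.DoFn):
        def __init__(self):
            # Each distinct (namespace, name) pair becomes one Dataflow
            # custom metric descriptor in Stackdriver, which is what the
            # 100-descriptor limit above applies to.
            self.counted = Metrics.counter(self.__class__, 'elements_counted')

        def process(self, element):
            self.counted.inc()
            yield element
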
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:24:36.417Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:25:21.379Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:25:21.400Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:30:11.350Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:31:13.701Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:31:13.747Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:31:13.774Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:31:13.803Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:31:13.828Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:32:07.221Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:32:07.259Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T20:32:07.288Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_13_24_04-4515641317461515192 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
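
test_key_param exercises a DoFn that asks the runner for the key of a keyed element, which is what the 'Key param' step fused into the graphs above corresponds to; roughly (a sketch, assuming the DoFn.KeyParam parameter and hypothetical names):

    import apache_beam as beam

    class KeyDoFn(beam.DoFn):
        def process(self, element, key=beam.DoFn.KeyParam):
            # For a (key, value) element the runner injects the key here.
            yield key

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([('k1', 1), ('k2', 2)])
             | 'Key param' >> beam.ParDo(KeyDoFn()))
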

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2105.706s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_37-17913471206501161994?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_20-5853013643715833073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_15_31-9751242216490887702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_24_04-4515641317461515192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_35-9677608970757006006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_23-4225792011166054523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_15_15-14936300204009527939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_35-8967312386194438350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_05_45-5070060796226433028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_14_19-17144449446242364933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_37-4908757361240055965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_17-253877101598972536?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_14_40-11019655212657244337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_37-4141396979458015569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_25-10360693872706126649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_36-12732985436431729518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_20-16958297836204125462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_15_20-459286025012616842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_34-9961559500712218109?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_08-15138641128754176348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_14_57-1792557171554248102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_12_57_35-718415993803198289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_06_10-5117035324997580639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_13_14_47-15492082154850539132?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 0s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ls4b4bxspktrs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #115

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/115/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2939] Implement interfaces and concrete watermark estimators


------------------------------------------
[...truncated 5.43 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T18:29:50.902684Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_11_29_49-17117919899503904848'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313182934-135685'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T18:29:50.902684Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_11_29_49-17117919899503904848]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_29_49-17117919899503904848?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_11_20_33-1525101494643138328 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_11_20_34-10047996552588742807 is in state JOB_STATE_DONE
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_11_29_49-17117919899503904848 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:49.843Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_11_29_49-17117919899503904848. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:49.843Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:49.843Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_11_29_49-17117919899503904848.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:53.369Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:54.402Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.016Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.051Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.187Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.224Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.257Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.303Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.343Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.453Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.547Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.607Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.652Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.677Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.709Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.740Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.800Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.850Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.895Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.931Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.968Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:55.999Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.023Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.061Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.091Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.132Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.167Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.207Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.250Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.287Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.326Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.362Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.402Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.438Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.478Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.520Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.558Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:56.596Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:59.118Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:59.151Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:29:59.195Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:30:07.597Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:30:22.579Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:30:32.630Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:30:32.679Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:30:32.715Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_11_21_20-5478329517200257508 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
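
test_reshuffle_preserves_timestamps covers the Reshuffle transform whose expansion (AddRandomKeys, ReshufflePerKey, RemoveRandomKeys) accounts for most of the fusion messages above; in outline (a sketch):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['a', 'b', 'c'])
             # Reshuffle redistributes elements across workers while
             # preserving their timestamps and windows, which is the
             # property the test asserts.
             | beam.Reshuffle())
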
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:31:06.188Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:31:06.277Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:35:57.843Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:00.597Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:00.659Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:00.753Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:01.009Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:01.044Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:52.369Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:52.415Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T18:38:52.497Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_11_29_49-17117919899503904848 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
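
The AttributeError in this trace is the generic failure mode of code that expects a single upstream step object but is handed a list; a contrived reproduction of just the error (not Beam's actual internals):

    class Step(object):
        def __init__(self, name):
            self.proto = type('Proto', (object,), {'name': name})()

    # run_ParDo reads input_step.proto.name; for this multi-triggered GBK
    # side input the step lookup evidently produced a list of steps rather
    # than a single step, so the attribute access fails.
    input_step = [Step('s15'), Step('s17')]
    print(input_step.proto.name)  # AttributeError: 'list' object has no attribute 'proto'
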

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2143.378s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_42-3901439655185594536?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_12_51-15899763584026059262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_21_20-15893706383582691475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_29_49-17117919899503904848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_40-17313422923537848449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_11_53-13138096197039326723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_20_42-4698349634158864713?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_41-742495597206101706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_11_03-15396282958939640673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_20_34-10047996552588742807?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_42-17100363348734509069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_11_49-12284629225038243289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_21_20-5478329517200257508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_42-7663887485881684829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_12_02-8604433209031850256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_20_33-1525101494643138328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_38-3754707157358779698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_12_51-3914499929875150829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_41-1935781362858733869?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_11_40-10854781839377503006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_20_03-1070231795923647374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_03_41-1905266987900168260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_11_52-15884943440939237514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_11_20_39-15749986079148595583?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 17s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/rb24of5g2v5ee

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #114

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/114/display/redirect>

Changes:


------------------------------------------
[...truncated 5.42 MB...]
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T13:08:14.931436Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_06_08_13-4437598118502375176'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313130757-647632'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T13:08:14.931436Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_06_08_13-4437598118502375176]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_08_13-4437598118502375176?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.775Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.804Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_08_13-4437598118502375176 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_00_16-15229305549948383031 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:13.847Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_06_08_13-4437598118502375176. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:13.847Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_06_08_13-4437598118502375176.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:13.847Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:17.726Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:18.585Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.146Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.183Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.251Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.285Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.319Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.358Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.400Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.518Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.612Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.680Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.717Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.743Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.782Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.811Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.847Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.887Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.922Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.946Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:19.982Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.022Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.051Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.084Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.121Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.144Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.185Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.223Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.254Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.277Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.345Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.383Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.417Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.453Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.497Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.533Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.570Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:20.601Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.593Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.627Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:24.653Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:39.493Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:39.628Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:39.668Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:42.302Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:42.344Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:42.376Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:40.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:40.180Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:40.221Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_00_18-4120484805235893865 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_00_33-16384780287477757791 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_00_24-14239180206155127550 is in state JOB_STATE_DONE
test_empty_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:51.533Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
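
The 100-descriptor warning above recurs on every run in this project. A minimal cleanup sketch, assuming the google-cloud-monitoring Python client and a guessed "custom.googleapis.com/dataflow" prefix for the Dataflow-created descriptors, driving the metricDescriptors.list/delete endpoints linked in the message:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": "projects/apache-beam-testing",
        # Hypothetical filter; the exact metric prefix is an assumption.
        "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
        # Uncomment to reclaim quota for descriptors no longer in use:
        # client.delete_metric_descriptor(name=descriptor.name)
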
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:08:51.732Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:09:31.698Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:09:31.737Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:14:23.137Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:15:25.624Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:15:25.680Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:15:25.799Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:15:25.841Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:15:25.867Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:16:16.694Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:16:16.730Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T13:16:16.766Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_06_08_13-4437598118502375176 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
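
The traceback shows the job never reached Dataflow: graph translation in run_ParDo (dataflow_runner.py line 957 above) reads input_step.proto, but for this graph the producing-step lookup evidently returns a list of steps rather than a single step, and the visitor aborts on the "Main windowInto" transform. A minimal sketch of the shape of pipeline the failing test exercises, inferred from its name and docstring (a multi-triggered GroupByKey consumed as a side input; the exact test body is an assumption, not a copy):

    import apache_beam as beam
    from apache_beam.transforms import trigger, window

    with beam.Pipeline() as p:
      # Side input: a GBK whose windows fire repeatedly (multiple triggerings).
      side = (
          p
          | 'SideCreate' >> beam.Create([('k', 1), ('k', 2)])
          | 'SideWindow' >> beam.WindowInto(
              window.FixedWindows(10),
              trigger=trigger.Repeatedly(trigger.AfterCount(1)),
              accumulation_mode=trigger.AccumulationMode.DISCARDING)
          | 'SideGBK' >> beam.GroupByKey())
      # Main input: windowed, then paired with the side input.
      _ = (
          p
          | 'MainCreate' >> beam.Create([1, 2, 3])
          | 'Main windowInto' >> beam.WindowInto(window.FixedWindows(10))
          | 'UseSide' >> beam.Map(lambda x, s: (x, list(s)),
                                  s=beam.pvalue.AsIter(side)))

Under the DirectRunner a pipeline like this runs; the AttributeError is specific to the Dataflow runner's step translation, which is why it only surfaces in this ValidatesRunner suite.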

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 1980.879s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_08-3035984995632228554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_42-6543195379197979003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_37-5754587041038256894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_08_13-4437598118502375176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_01-1227662321037558580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_37-15151679181942359029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_16-15229305549948383031?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_02-2170922351708768947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_32-10113074494275132618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_24-14239180206155127550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_09-334573342448531024?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_43-18261286372411926133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_06-9819802381605127253?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_52_46-8253996549772828082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_33-16384780287477757791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_01-7783151481341546020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_41-6826635729257171950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_23-12047582885754236355?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_10-17903362858644421733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_43-4114893360092061211?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_06_00_18-4120484805235893865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_44_04-2418166788964857267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_51_35-11692953869325605290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_05_59_02-116086732658870157?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org
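
To reproduce either failure locally with the diagnostics Gradle suggests, the failing task paths from this report can be run straight from a Beam checkout (standard wrapper invocation; local GCP credentials and the usual Dataflow test options are assumed):

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace --info
    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests --stacktrace --info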

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 31s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/5z4wtnqa6nfbu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #113

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/113/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9299-PR]Upgrade Flink Runner 1.8x to 1.8.3 and 1.9x to 1.9.2


------------------------------------------
[...truncated 5.50 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T11:33:41.821634Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_04_33_40-8585301938109057509'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313113325-687822'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T11:33:41.821634Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_04_33_40-8585301938109057509]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_33_40-8585301938109057509?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_04_33_40-8585301938109057509 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:40.547Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:40.547Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_04_33_40-8585301938109057509. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:40.547Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_04_33_40-8585301938109057509.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:44.193Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:45.415Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:45.975Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.007Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.069Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.094Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.114Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.146Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.167Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.246Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.329Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.378Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.407Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.428Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.458Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.479Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.504Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.524Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.548Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.570Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.595Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.615Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.642Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.663Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.687Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.709Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.732Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.753Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.780Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.805Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.834Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.856Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.887Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.915Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.949Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.969Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:46.992Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:47.024Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:49.240Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:49.268Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:33:49.298Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:34:04.459Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:34:14.994Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:34:46.532Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:34:46.564Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:48.164Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:50.495Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:50.567Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:50.586Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:50.611Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:39:50.640Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:40:52.308Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:40:52.350Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T11:40:52.381Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_04_33_40-8585301938109057509 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2064.975s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_07_01-1699556775118509359?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_15_56-17650266084958615259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_24_41-11226794124176041643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_33_40-8585301938109057509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_57-4722093169262789453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_15_11-14580307765099028791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_23_41-8641893534093369188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_07_00-15976588634021274327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_14_36-5925454656863778145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_22_13-2946485162647146331?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_59-10839113083629835689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_14_54-3611866522922548025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_22_22-22438749882425747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_59-6982129780736857541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_16_14-795551151688227918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_58-14861369847771142088?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_15_53-1458527679051591768?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_24_26-15671287250935713258?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_57-5697854339057888369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_14_46-5285600895751068639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_23_22-1024962944964423113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_06_57-6908357562388890291?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_14_58-7488402911020312466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_04_23_31-13751777447819516861?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jqrjrcre7ksoq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #112

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/112/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-9287] disable validates runner test which uses teststreams for


------------------------------------------
[...truncated 5.51 MB...]
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T10:19:24.053572Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_03_19_22-510630318562111287'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313101905-423753'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T10:19:24.053572Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_03_19_22-510630318562111287]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_19_22-510630318562111287?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_03_11_01-4655411559137167514 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_03_19_22-510630318562111287 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:22.842Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_03_19_22-510630318562111287.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:22.842Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_03_19_22-510630318562111287. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:22.842Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:27.141Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:27.980Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.481Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.516Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.589Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.634Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.667Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.708Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.744Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.857Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:28.961Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.028Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.059Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.083Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.124Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.148Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.188Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.221Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.251Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.276Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.340Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.372Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.413Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.438Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.464Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.524Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.550Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.573Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.604Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.636Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.662Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.693Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.732Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.765Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.790Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:29.812Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:32.162Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:32.242Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:32.361Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:54.499Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:19:55.041Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:20:27.781Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:20:27.807Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:31.094Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:35.118Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:35.168Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:35.201Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:35.231Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:25:35.261Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:26:16.420Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:26:16.545Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T10:26:16.584Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_03_19_22-510630318562111287 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2015.161s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_14-16353659728417389944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_02_25-8542393428043234699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_11_26-17823438224510598761?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_19_22-510630318562111287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_12-13844206430681370443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_02_04-11531553329746129232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_10_44-627317024222038413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_13-5038918205545089197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_01_09-6490508005515018294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_09_40-7123134554250278236?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_14-14706152010241085212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_02_40-6707943054012919708?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_11-3011075858605668933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_01_55-426160016943924426?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_10_41-6996321866636572511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_15-15697128544767438711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_02_17-1436009740415456305?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_11_01-4655411559137167514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_11-6359873071981428874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_01_09-996594348857044463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_08_41-1946665177137964941?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_02_53_14-11681324415417568429?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_02_10-17022168825396711438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_03_10_45-1683143800635575575?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
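
Following the "Try" suggestions above, the two failing suites can be re-run with extra diagnostics. The task paths are taken verbatim from the failure messages; invoking the ./gradlew wrapper from the repository root is assumed:

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --info --stacktrace
    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests --info --stacktrace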

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
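
Likewise, the individual deprecation warnings behind the notice above can be surfaced by adding the flag it names to the same (assumed) invocation:

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests --warning-mode all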

BUILD FAILED in 1h 14m 17s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ftvtgijt7atry

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #111

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/111/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T07:17:06.541355Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-13_00_17_05-4411669371199816042'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313071648-004331'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T07:17:06.541355Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-13_00_17_05-4411669371199816042]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_17_05-4411669371199816042?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_00_17_05-4411669371199816042 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:05.287Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:05.287Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-13_00_17_05-4411669371199816042. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:05.288Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-13_00_17_05-4411669371199816042.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:11.586Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.152Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.783Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.830Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.924Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.964Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:13.996Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.046Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.077Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.189Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.293Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.367Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.414Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.446Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.486Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.526Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.566Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.599Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.641Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.679Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.720Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.757Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.798Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.831Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.860Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.891Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:14.968Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.009Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.045Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.078Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.115Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.156Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.192Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.249Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.287Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.322Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:15.365Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:17.625Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:17.665Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:17.698Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:41.746Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:17:47.414Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:18:09.274Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:18:09.300Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:23:16.648Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:24:20.810Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:24:20.860Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:24:20.885Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:24:21.032Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:24:21.065Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:25:27.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:25:27.444Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T07:25:27.500Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-13_00_17_05-4411669371199816042 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
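
As an aside on the metric-descriptor warning earlier in this run: the descriptors it counts can be inspected (and, if unused, deleted) through the Cloud Monitoring v3 API. A minimal sketch, assuming the google-cloud-monitoring client library and suitable credentials; the project id is the one from the logs, and which descriptors are actually safe to delete is not something these logs establish:

    # Sketch only -- requires google-cloud-monitoring and application credentials.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    for descriptor in client.list_metric_descriptors(name="projects/apache-beam-testing"):
        print(descriptor.type)  # inspect before deciding anything is unused
        # Deleting an unused descriptor would be:
        # client.delete_metric_descriptor(name=descriptor.name)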

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
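
On the --region warning repeated in these runs: the option can be set explicitly instead of relying on the us-central1 default. A minimal sketch using the public PipelineOptions API; the project and region values are simply the ones appearing in the logs above:

    # Runs locally; only constructs options, does not launch a job.
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions([
        '--project=apache-beam-testing',
        '--region=us-central1',  # set explicitly, as the warning advises
    ])
    print(options.view_as(GoogleCloudOptions).region)  # us-central1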

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2176.478s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_48-5699611526331167379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_58_57-5019073673076062816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_07_28-18107344609388819820?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_17_05-4411669371199816042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_47-13866117293576128217?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_58_52-16607850894211136286?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_48-1989996661322917072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_57_23-10207282644603091664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_06_03-13350464594969168515?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_48-2194365903376331992?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_57_34-4113099328107917363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_06_09-13721309567602030890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_48-2824784164647225169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_58_48-3324107180288093683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_07_29-2044697436263619051?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_45-17154192814430042810?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_58_39-374969468259359975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_07_09-164839137713459721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_48-17443104367414448148?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_57_49-12884469585157731832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_06_19-11984802028494923530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_49_46-5535515952369763055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_23_58_37-7026182257957535789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-13_00_07_13-7382505071766232003?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 56s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jsojscfh5dnyo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #110

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/110/display/redirect?page=changes>

Changes:

[github] [BEAM-8335] Implemented Capture Size limitation (#11050)

[github] [BEAM-9294] Move RowJsonException out of RowJsonSerializer (#11102)

[github] Merge pull request #11046: [BEAM-9442] Properly handle nullable fields


------------------------------------------
[...truncated 5.43 MB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T02:20:20.059541Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_19_20_18-18018379296390129470'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313022002-862090'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T02:20:20.059541Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_19_20_18-18018379296390129470]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_20_18-18018379296390129470?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_19_20_18-18018379296390129470 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:18.868Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_19_20_18-18018379296390129470. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:18.868Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:18.868Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_19_20_18-18018379296390129470.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:22.074Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:22.891Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.501Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.535Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.593Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.628Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.659Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.697Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.736Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.825Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.922Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:23.995Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.031Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.086Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.108Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.146Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.183Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.222Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.256Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.287Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.322Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.353Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.383Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.411Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.437Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.461Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.487Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.516Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.554Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.590Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.622Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.654Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.690Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.722Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.765Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.805Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.839Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:24.872Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:27.116Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:27.145Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:27.184Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:20:42.082Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:03.645Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:04.435Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:04.485Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:04.534Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_19_11_52-11299810707816376510 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:34.708Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:21:34.738Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:26:26.083Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:27:30.960Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:27:31.028Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:27:31.057Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:27:31.092Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:27:31.123Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:28:15.213Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:28:15.255Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T02:28:15.297Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_19_20_18-18018379296390129470 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2055.206s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_31-16377198363910840219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_09-6060668747994688068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_49-17370649684916895760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_20_18-18018379296390129470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_33-9538340890550287344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_10-8127936143142248689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_52-11299810707816376510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_33-1014970021361167096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_01_46-12633911364901832113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_10_26-13248146292844656930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_33-2046625007062860124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_09-7537122789734839899?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_43-16451804695202032888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_34-17493491474064718888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_10-16056052174742536582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_33-10598697111557847968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_10-14566076635975393656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_44-7717245547365116466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_34-10392395961230320508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_15-1100446020140329815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_39-12495632466234499793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_18_54_32-16147574321465487842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_03_12-1094862212169269503?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_19_11_47-13654588294949137529?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 49s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ben7jqlu6tkfk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #109

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/109/display/redirect?page=changes>

Changes:

[github] Merge pull request #11103 from [BEAM-9494] Reifying outputs from BQ file


------------------------------------------
[...truncated 5.43 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-13T00:06:45.730285Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_17_06_44-3429583185355603789'
 location: u'us-central1'
 name: u'beamapp-jenkins-0313000627-987850'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-13T00:06:45.730285Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_17_06_44-3429583185355603789]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_17_06_44-3429583185355603789?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_17_06_44-3429583185355603789 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:44.636Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:44.636Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_17_06_44-3429583185355603789.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:44.636Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_17_06_44-3429583185355603789. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:48.047Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:50.137Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:50.845Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:50.873Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:50.937Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:50.975Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.009Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.051Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.085Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.182Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.284Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.349Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.370Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.404Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.431Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.467Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.500Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.525Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.549Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.573Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.604Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.639Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.676Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.702Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.730Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.760Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.788Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.814Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.839Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.870Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.908Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.946Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:51.973Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:52Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:52.045Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:52.077Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:52.099Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:52.123Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:56.911Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:56.950Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:06:56.986Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:07:01.016Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
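The 100-descriptor warning above (it recurs in every run below) points at the Monitoring API's metricDescriptors.list and metricDescriptors.delete methods. A minimal cleanup sketch, assuming the google-cloud-monitoring client library and assuming Dataflow's custom metrics are registered under the custom.googleapis.com/dataflow/ type prefix (neither assumption comes from this log):

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = 'projects/apache-beam-testing'  # project id taken from the log
# Assumed type prefix for Dataflow-created custom metric descriptors.
prefix = 'custom.googleapis.com/dataflow/'
for descriptor in client.list_metric_descriptors(name=project_name):
    if descriptor.type.startswith(prefix):
        client.delete_metric_descriptor(name=descriptor.name)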
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:07:27.224Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:07:56.945Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:07:57.010Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:12:55.861Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:13:58.630Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:13:58.694Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:13:58.724Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:13:58.765Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:13:58.794Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:14:52.109Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:14:52.148Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-13T00:14:52.183Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_17_06_44-3429583185355603789 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
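Every failure above is the same crash: run_ParDo reads input_step.proto.name, but for this graph shape the producer lookup has returned a list of steps rather than a single step. The captured logging names the transform 'Main windowInto', i.e. a windowed main input whose ParDo consumes a multi-triggered GroupByKey side input. Below is a minimal sketch of that pipeline shape (element values and trigger settings are illustrative, not the exact sideinputs_test.py code; the identical failures in builds #108 and #107 further down match the same shape):

import apache_beam as beam
from apache_beam import pvalue
from apache_beam.transforms import trigger, window

with beam.Pipeline() as p:
    # Side input: a GroupByKey whose windows may fire multiple times.
    side = (
        p
        | 'side' >> beam.Create([('k', 1), ('k', 2)])
        | 'side windowInto' >> beam.WindowInto(
            window.FixedWindows(10),
            trigger=trigger.Repeatedly(trigger.AfterCount(1)),
            accumulation_mode=trigger.AccumulationMode.DISCARDING)
        | 'gbk' >> beam.GroupByKey())

    # Main input; the transform label mirrors the 'Main windowInto'
    # reported in the captured logging above.
    main = (
        p
        | 'main' >> beam.Create(['a', 'b'])
        | 'Main windowInto' >> beam.WindowInto(window.FixedWindows(10)))

    main | 'use side' >> beam.Map(
        lambda x, s: (x, list(s)), s=pvalue.AsIter(side))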

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2184.922s

FAILED (errors=1)
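The nosetests XML above indicates nose is the runner, so the failing case can be iterated on in isolation instead of re-running all 28 tests. A small sketch, assuming the test method hangs off a class named SideInputsTest (the class name is an assumption; the module path comes from the traceback) and that any required pipeline options are supplied the same way the suite does:

import nose

# Run only the failing test; nose accepts module:Class.method selectors.
nose.run(argv=[
    'nosetests',
    'apache_beam.transforms.sideinputs_test'
    ':SideInputsTest.test_multi_triggered_gbk_side_input',
])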
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_39_00-15072944538447319338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_48_29-9360797898151537629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_58_04-6799355815615478632?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_17_06_44-3429583185355603789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_38_57-5138690285209056116?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_48_27-10868410131003587215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_57_05-3502858364281902692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_39_00-1321521417522855947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_46_13-9352644729086966796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_54_57-4359278144980359195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_38_59-12602923167935607718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_48_21-12803557539337072423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_38_56-14460437013879769479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_47_28-8254111234689862457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_55_53-12378923803715909238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_38_58-16131351003038497086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_47_48-12501361160063246239?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_57_27-15070028710589333082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_38_57-10313824027154513782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_47_29-14766418237215403610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_57_08-2293437337209682680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_39_00-18404618127921070183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_47_47-1306433184071903002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_16_57_13-11677761820968981429?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 15s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/gn3j67lwvlqlk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #108

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/108/display/redirect?page=changes>

Changes:

[heejong] [BEAM-9056] Staging artifacts from environment


------------------------------------------
[...truncated 5.43 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T22:23:59.622906Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_15_23_58-10019505726393270602'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312222342-176035'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T22:23:59.622906Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_15_23_58-10019505726393270602]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_23_58-10019505726393270602?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_15_23_58-10019505726393270602 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:23:58.575Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_15_23_58-10019505726393270602. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:23:58.575Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_15_23_58-10019505726393270602.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:23:58.575Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:01.735Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:02.565Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.171Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.237Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.332Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.369Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.404Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.449Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.479Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.556Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.657Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.712Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.748Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.783Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.814Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.854Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.891Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.914Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.948Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:03.985Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.025Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.076Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.113Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.147Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.183Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.208Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.236Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.306Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.330Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.365Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.400Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.437Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.474Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.511Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.540Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.575Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:04.601Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:06.818Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:06.854Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:06.878Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:27Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:24:36.744Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:25:13.764Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:25:13.851Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:30:05.865Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:31:13.783Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:31:13.848Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:31:13.879Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:31:13.920Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:31:14.003Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:32:15.390Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:32:15.432Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T22:32:15.466Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_15_23_58-10019505726393270602 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2130.669s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_17-7897042910184908124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_52-3516119088596170288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_15_28-4588500856721227273?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_23_58-10019505726393270602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_15-10464028231429652528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_52-5737714843483338848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_14_32-16725800606445571395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_21-10647577396764848844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_17-7210963771131842894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_14_11-9869699105545791881?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_17-16818514647597754152?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_55-11992263056963081525?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_14_34-18424612428777716926?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_14-5144566588947960640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_06_44-15332857810764675443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_15_22-3399322326417345208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_17-831416405062348972?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_06_58-10404457101705634125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_19-5460091146790276452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_55-14700843611989158231?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_14_36-11290299696766060289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_14_57_15-3268168189573342023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_05_48-12033963527258663028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_15_14_23-1905446445096141320?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 5s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/mp2i65bmisuvi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #107

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/107/display/redirect?page=changes>

Changes:

[github] Verify schema early in ToJson and JsonToRow (#11105)


------------------------------------------
[...truncated 5.49 MB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T19:57:29.742183Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_12_57_28-6276937492020889275'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312195702-544994'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T19:57:29.742183Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_12_57_28-6276937492020889275]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_57_28-6276937492020889275?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_12_57_28-6276937492020889275 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:40.334Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:40.374Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:40.409Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:28.563Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_12_57_28-6276937492020889275.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:28.563Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_12_57_28-6276937492020889275. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:28.563Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:32.382Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:33.299Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.282Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.345Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.381Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.421Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.459Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.490Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.590Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.684Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.756Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.792Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.830Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.867Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.902Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.941Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:34.973Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.012Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.080Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.128Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.201Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.243Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.280Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.321Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.353Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.390Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.426Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.467Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.504Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.539Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.576Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.618Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.658Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.694Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:35.728Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:38.815Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:38.860Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:38.903Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_12_48_15-10840328215403298130 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:57:58.793Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:58:28.260Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:59:28.810Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T19:59:28.835Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:03:37.797Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:06:42.806Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:06:42.872Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:06:42.963Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:06:43.014Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:06:43.085Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:07:35.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:07:35.681Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T20:07:35.722Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_12_57_28-6276937492020889275 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
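The AttributeError above is raised while the legacy Dataflow translation resolves the producing step for a ParDo input: with a multi-triggered GroupByKey feeding a side input, the lookup evidently yields a list of steps rather than a single step object, so input_step.proto fails. A minimal sketch of the pipeline shape being exercised (a hypothetical reduction, not the actual sideinputs_test.py code):

# Hypothetical reduction of the failing shape: a repeatedly-triggered
# GroupByKey whose output is consumed as a side input.
import apache_beam as beam
from apache_beam.transforms import trigger, window

with beam.Pipeline() as p:
  side = (
      p
      | 'CreateSide' >> beam.Create([('k', 1), ('k', 2)])
      | 'MultiTrigger' >> beam.WindowInto(
          window.GlobalWindows(),
          trigger=trigger.Repeatedly(trigger.AfterCount(1)),
          accumulation_mode=trigger.AccumulationMode.DISCARDING)
      | 'GBK' >> beam.GroupByKey())
  main = (
      p
      | 'CreateMain' >> beam.Create([1, 2, 3])
      | 'Main windowInto' >> beam.WindowInto(window.FixedWindows(5)))
  _ = main | 'UseSide' >> beam.Map(
      lambda x, s: (x, list(s)), s=beam.pvalue.AsIter(side))

The captured "Error while visiting Main windowInto" line suggests the failure surfaces while translating a WindowInto on the main input, consistent with run_ParDo expecting a single upstream step there.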

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2261.889s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_25-18133836967295425105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_39_16-17317686971212121719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_48_19-5332942236236145792?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_57_28-6276937492020889275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_26-7471275498463824084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_38_22-10461007248439994614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_47_02-543639618830854326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_26-7428906081768737725?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_40_08-12720843540515768764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_26-13411880351992424404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_37_55-1766030739584594282?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_48_15-10840328215403298130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_25-17958300749462880620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_39_14-4864085991330339711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_48_09-1806544415371795380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_26-16860661338582335233?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_38_58-2064175854463348920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_47_46-828235699134272059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_27-16759110425987034978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_39_13-10536643932454288957?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_47_45-12406421607541428267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_30_25-1754963796642753255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_39_07-7376758281999853029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_12_47_46-5690678341668568279?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 3s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/dwvw66qct7i4i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #106

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/106/display/redirect?page=changes>

Changes:

[github] Fixing apache_beam.io.gcp.bigquery_test:PubSubBigQueryIT. at head


------------------------------------------
[...truncated 5.44 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
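The nested "encoding" entries in the job JSON above describe a windowed-value coder wrapping a pair of FastPrimitivesCoders in the global window. A sketch of the same structure built directly with the SDK's coder classes (assembled here only for illustration; the job serializes it rather than constructing it this way):

# Sketch: reconstruct the coder structure from the "encoding" block.
from apache_beam import coders
from apache_beam.transforms import window
from apache_beam.utils.windowed_value import WindowedValue

pair_coder = coders.TupleCoder(
    [coders.FastPrimitivesCoder(), coders.FastPrimitivesCoder()])
windowed_coder = coders.WindowedValueCoder(
    pair_coder, window_coder=coders.GlobalWindowCoder())

# Round-trip one element to show the structure is self-consistent.
wv = WindowedValue(('key', 1), 0, (window.GlobalWindow(),))
assert windowed_coder.decode(windowed_coder.encode(wv)).value == ('key', 1)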
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T17:56:56.170498Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_10_56_55-11149883925904266893'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312175638-294033'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T17:56:56.170498Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_10_56_55-11149883925904266893]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_56_55-11149883925904266893?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_10_56_55-11149883925904266893 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_10_56_55-11149883925904266893.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_10_56_55-11149883925904266893. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:56:55.074Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
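The streaming-engine warning above notes that scaling stays between 1 and 100 workers unless maxNumWorkers is set. For reference, a sketch of pinning that bound via the standard Dataflow pipeline options (flag names are the usual ones; values here are illustrative):

# Sketch: cap autoscaling with the standard Dataflow worker options.
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--streaming',
    '--max_num_workers=10',  # upper bound for autoscaling
])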
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:01.949Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:02.555Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.081Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.155Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.229Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.258Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.283Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.328Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.368Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.479Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.599Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.651Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.682Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.716Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.748Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.778Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.800Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.839Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.875Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.914Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.942Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:03.975Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.009Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.044Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.074Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.110Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.149Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.181Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.225Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.258Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.292Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.317Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.361Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.394Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.433Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.463Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.497Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:04.531Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:21.934Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.399Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.446Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:57:35.493Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:04.719Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:30.500Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T17:58:30.534Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:03:05.741Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.419Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.475Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.500Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.541Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:04:37.566Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.319Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.528Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T18:05:22.599Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_10_56_55-11149883925904266893 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2182.606s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_38-12902448343493459045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_39_04-13755921482621391046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_48_14-14199041188818676489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_56_55-11149883925904266893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-7496866972695785782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_17-9223809253366558464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_47_14-12556406680748801646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-11322849103390080251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_09-5106433121096340210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_44-974645006389504310?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_38-10300893707782139206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_37_40-18132827848236419716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_31-7061421517999968259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_39-14167326813525540096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_39_14-11936083830124716780?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_36-10261436357990232592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_04-3068100152836446232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_40-2442995916299268955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_39-16080148491383674731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_16-14295870522617893294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_47_50-18366059412314013485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_29_37-17267459930739234312?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_38_18-7890722251611419122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_10_46_59-12366168715647852489?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 40s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/fpjwrrqvthc7y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #105

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/105/display/redirect>

Changes:


------------------------------------------
[...truncated 5.43 MB...]
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T13:11:30.461822Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_06_11_29-6916819049968599719'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312131112-799499'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T13:11:30.461822Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_06_11_29-6916819049968599719]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_11_29-6916819049968599719?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_06_11_29-6916819049968599719 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:42.301Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:42.361Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:42.403Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:29.444Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:29.444Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_06_11_29-6916819049968599719. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:29.444Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_06_11_29-6916819049968599719.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:32.953Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:33.764Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.353Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.417Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.491Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.518Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.558Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.600Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.630Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.728Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.821Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.892Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.929Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:34.966Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.002Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.027Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.063Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.086Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.129Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.163Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.195Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.230Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.261Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.296Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.332Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.373Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.406Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.448Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.472Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.500Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.535Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.572Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.596Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.636Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.687Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.720Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.752Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:35.788Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:38.232Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:38.273Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:38.315Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_06_02_42-3984953372180190075 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:11:47.881Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:12:03.931Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:12:43.460Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:12:43.489Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:17:37.190Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:18:39.469Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:18:39.508Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:18:39.538Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:18:39.575Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:18:39.602Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:19:29.779Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:19:29.841Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T13:19:29.896Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_06_11_29-6916819049968599719 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2072.818s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_32-12628975238209806964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_54_22-6126870558210763281?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_02_53-7397010516069772870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_11_29-6916819049968599719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_32-7133055601475320671?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_54_03-24177359760468471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_02_42-3984953372180190075?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_33-4847374305212467733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_52_47-14544584027480663208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_01_22-371378962732161766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_32-13900354122509783963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_54_22-8358130878398343884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_32-14863444004401195646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_54_01-10614461792892386010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_01_47-14238723966059650480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_31-15556176281686312324?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_53_03-8051329680585741059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_01_27-4681138307135629498?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_33-14217694524863463514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_53_59-16420217526963460584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_02_25-1485919235831103055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_45_31-18233602476434404108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_05_52_52-18399745631792902283?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_06_01_37-1040849567703185537?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 36s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/hy4vtu7lcwpwm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #104

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/104/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9298] Drop support for Flink 1.7


------------------------------------------
[...truncated 5.45 MB...]
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
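
The coder spec in the step above nests a pair-like FastPrimitivesCoder inside a kind:windowed_value envelope; the base64 "FastPrimitivesCoder$eNpr..." payloads are opaque serialized coder objects. As a point of reference, a minimal round-trip sketch with the real apache_beam coder (illustrative only, not part of this build):

    from apache_beam.coders import coders

    # FastPrimitivesCoder handles primitives and pair-like values such as
    # 2-tuples, which is what "is_pair_like": true in the spec refers to.
    coder = coders.FastPrimitivesCoder()
    encoded = coder.encode(('key', 42))
    assert coder.decode(encoded) == ('key', 42)
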
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T11:38:07.786166Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_04_38_06-3690159037392020067'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312113746-940960'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T11:38:07.786166Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_04_38_06-3690159037392020067]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_38_06-3690159037392020067?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.255Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.308Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.353Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_04_38_06-3690159037392020067 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_04_29_10-5158414438595464012 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
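
test_reshuffle_preserves_timestamps asserts that Reshuffle does not disturb element timestamps. A minimal sketch of that property using only standard apache_beam APIs (this is not the test's actual code):

    import apache_beam as beam
    from apache_beam.transforms.window import TimestampedValue
    from apache_beam.utils.timestamp import Timestamp
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        result = (
            p
            | beam.Create([('k', 1), ('k', 2)])
            # Attach explicit timestamps before the shuffle.
            | beam.Map(lambda kv: TimestampedValue(kv, kv[1]))
            | beam.Reshuffle()
            # Read the timestamps back out after the shuffle.
            | beam.Map(lambda kv, ts=beam.DoFn.TimestampParam: (kv[0], kv[1], ts)))
        assert_that(result, equal_to([('k', 1, Timestamp(1)), ('k', 2, Timestamp(2))]))
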
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:06.545Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:06.545Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_04_38_06-3690159037392020067.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:06.545Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_04_38_06-3690159037392020067. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:10.400Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:11.262Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:11.960Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:11.986Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.046Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.083Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.114Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.156Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.196Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.295Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:12.366Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.761Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.796Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.822Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.845Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.900Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.926Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.956Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:13.993Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.047Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.079Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.105Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.134Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.166Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.194Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.222Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.249Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.273Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.303Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.321Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.350Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.376Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.412Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.441Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.472Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:14.504Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:17.710Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:17.743Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:17.771Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:40.725Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
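
The metric-descriptor warning above is a per-project Stackdriver quota condition rather than a pipeline bug: every unique user-defined metric name mints a new descriptor, and the project has hit its limit. A hedged sketch of listing candidate descriptors for cleanup with the google-cloud-monitoring client library; the custom.googleapis.com/dataflow filter prefix and the v2-style call are assumptions, not taken from this log:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # 'apache-beam-testing' is the project named throughout this log.
    request = {
        "name": "projects/apache-beam-testing",
        # Assumed prefix for Dataflow-created custom metrics.
        "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print(descriptor.type)
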
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:38:52.300Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:39:19.550Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:39:19.573Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:44:16.805Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:45:21.347Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:45:21.404Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:45:21.496Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:45:21.542Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:45:21.586Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:46:21.381Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:46:21.423Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T11:46:21.469Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_04_38_06-3690159037392020067 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
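
test_key_param exercises beam.DoFn.KeyParam, which is also where the "Key param" step fused into Create/Map(decode) earlier in this log comes from: the runner injects each keyed element's key into a parameter defaulted to DoFn.KeyParam. A minimal sketch (not the test's actual code):

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        keys = (
            p
            | beam.Create([('a', 1), ('b', 2)])
            # The runner substitutes each element's key for the KeyParam default.
            | 'Key param' >> beam.Map(lambda _, key=beam.DoFn.KeyParam: key))
        assert_that(keys, equal_to(['a', 'b']))
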

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
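
The traceback pins the failure to run_ParDo in dataflow_runner.py reading input_step.proto.name: for this multi-triggered GBK side-input pipeline, the translator evidently receives a list of upstream steps where it expects a single step object. A hypothetical, simplified reconstruction of that failure shape (_Step and step_name_of are illustrative names, not Beam internals):

    class _Step(object):
        """Stand-in for the translator's per-transform step record."""
        def __init__(self, proto):
            self.proto = proto

    def step_name_of(input_step):
        # Mirrors the failing access pattern from run_ParDo:
        #   PropertyNames.STEP_NAME: input_step.proto.name
        return input_step.proto.name

    single = _Step(proto=type('Proto', (), {'name': 's20'})())
    print(step_name_of(single))    # s20

    try:
        step_name_of([single])     # a list where one step was expected
    except AttributeError as e:
        print(e)                   # 'list' object has no attribute 'proto'
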

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2143.527s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_12-15577661244565228160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_19_40-3382750547385807637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_29_28-17723880631552747858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_38_06-3690159037392020067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_14-18014338434405275176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_20_10-17008749032166114305?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_28_48-12500363280715083352?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_12-10514056670221488122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_20_17-11016613728684635009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_13-16105276571010491569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_18_48-4296774255486193682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_27_20-9121142558257969874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_15-5938226887792218819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_18_38-11748332034595514844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_27_20-18035497007313102884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_11-625116029368978519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_19_04-17226861649751938643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_27_35-17068311317955685165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_14-11566368464146181622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_20_29-17372806445857266010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_29_10-5158414438595464012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_11_13-13745396275421613237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_20_14-1203740932025993928?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_04_28_53-5796184230666968162?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 15s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/s5zbmpar3gzo2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #103

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/103/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-8335] Modify the StreamingCache to subclass the CacheManager


------------------------------------------
[...truncated 5.44 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T08:47:34.699601Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_01_47_33-1859011866094408043'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312084717-867968'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T08:47:34.699601Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_01_47_33-1859011866094408043]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_47_33-1859011866094408043?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_01_47_33-1859011866094408043 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:33.555Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_01_47_33-1859011866094408043.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:33.555Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_01_47_33-1859011866094408043. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:33.555Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:37.342Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:38.365Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:38.952Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:38.995Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.078Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.121Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.164Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.212Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.252Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.415Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.549Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.621Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.664Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.697Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.731Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.765Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.807Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.846Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.881Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.928Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:39.965Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.008Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.063Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.088Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.134Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.173Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.205Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.245Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.281Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.308Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.354Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.390Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.431Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.470Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.532Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.570Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.622Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:40.658Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:43.020Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:43.067Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:47:43.114Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:48:02.915Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:48:04.860Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:48:33.348Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:48:33.375Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:42.061Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:44.635Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:44.760Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:44.803Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:44.861Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:53:44.925Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:54:30.545Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:54:30.586Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T08:54:30.628Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_01_47_33-1859011866094408043 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
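
The captured logging names the failing node "Main windowInto". For orientation, a hedged sketch of the pipeline shape the test docstring describes, a GroupByKey result under a repeatedly-firing trigger consumed as a side input by the windowed main branch, using only standard apache_beam APIs (this is not the sideinputs_test code):

    import apache_beam as beam
    from apache_beam.transforms import trigger, window

    with beam.Pipeline() as p:
        side = (
            p
            | 'SideCreate' >> beam.Create([('k', 1), ('k', 2)])
            | 'SideWindow' >> beam.WindowInto(
                window.FixedWindows(10),
                trigger=trigger.Repeatedly(trigger.AfterCount(1)),
                accumulation_mode=trigger.AccumulationMode.DISCARDING)
            | 'GBK' >> beam.GroupByKey())
        main = (
            p
            | 'MainCreate' >> beam.Create([1, 2, 3])
            | 'Main windowInto' >> beam.WindowInto(window.FixedWindows(10))
            | 'UseSide' >> beam.Map(
                lambda x, s: (x, list(s)), s=beam.pvalue.AsIter(side)))
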

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2047.135s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_57-10573014744612094976?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_29_48-6998868681970237446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_38_38-13710983622553953668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_47_33-1859011866094408043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_54-8086749251525720266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_29_20-8641387428251223839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_37_15-15701997879340745341?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_55-13015190919402628986?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_28_29-3085326280156860785?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_36_59-11711204518558240256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_55-5137424794557957198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_29_46-6150586858834224314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_38_34-13646047362805024092?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_56-16963255882514397254?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_28_38-251183964800608541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_36_28-13815585868587440400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_53-2851088863342560801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_28_28-14022168894815640889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_37_13-10890667530057206253?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_55-15794939028315492013?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_30_20-5787237879379299006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_20_55-3229559686018822850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_28_36-11304900364354220258?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_01_36_21-9701375984895996861?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 31s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/kb6wku3nogsqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #102

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/102/display/redirect>

Changes:


------------------------------------------
[...truncated 5.43 MB...]
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T07:17:27.714228Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-12_00_17_26-859113861906087616'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312071710-649965'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T07:17:27.714228Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-12_00_17_26-859113861906087616]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_17_26-859113861906087616?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_00_17_26-859113861906087616 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:26.663Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-12_00_17_26-859113861906087616. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:26.663Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-12_00_17_26-859113861906087616.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:26.663Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:29.695Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:30.353Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:30.968Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.002Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.078Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.116Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.155Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.204Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.235Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.363Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.468Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.526Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.561Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.600Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.628Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.667Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.707Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.745Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.783Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.831Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.857Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.896Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.939Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:31.978Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.007Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.042Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.087Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.122Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.158Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.200Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.243Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.283Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.315Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.338Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.372Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.400Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.437Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:32.478Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
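
The JOB_MESSAGE_DEBUG lines above report that combiner lifting was skipped because neither GroupByKey feeds a combiner. A minimal sketch of the distinction, assuming the stock Beam Python SDK (the step names and values are illustrative, not taken from this job):

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | 'MakeKVs' >> beam.Create([('k', i) for i in range(10)])
        # Liftable: the CombineFn can be partially applied before the
        # shuffle, so the optimizer fuses it with the GroupByKey.
        summed = kvs | 'SumPerKey' >> beam.CombinePerKey(sum)
        # Not liftable, as in the log: a bare GroupByKey with no combiner
        # consuming its output.
        grouped = kvs | 'JustGroup' >> beam.GroupByKey()
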
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:34.781Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:34.816Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:17:34.859Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:18:04.879Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:18:06.005Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:18:51.844Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:18:51.897Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:23:33.844Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:25:36.505Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:25:36.560Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:25:36.602Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:25:36.650Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:25:36.684Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:26:29.958Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:26:30.010Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T07:26:30.057Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-12_00_17_26-859113861906087616 is in state JOB_STATE_DONE
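
The JOB_MESSAGE_WARNING in the log above notes that the project has hit the 100-descriptor limit for Dataflow-created custom metrics and points at the Monitoring v3 metricDescriptors list/delete methods. A sketch of pruning stale descriptors with the google-cloud-monitoring client library; the type-prefix filter is an assumption about how Dataflow names its custom metrics, so inspect the descriptors before enabling the delete:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    for descriptor in client.list_metric_descriptors(name=project_name):
        # Assumption: Dataflow-created custom metrics live under this
        # prefix; verify descriptor.type in your own project first.
        if descriptor.type.startswith('custom.googleapis.com/dataflow'):
            print('would delete', descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)
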
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
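
The failing test exercises a side input built from a GroupByKey under a trigger that fires more than once per window. A rough sketch of that shape, assuming the stock Beam Python SDK (this is not the exact body of test_multi_triggered_gbk_side_input; names and values are illustrative):

    import apache_beam as beam
    from apache_beam.transforms import trigger, window

    with beam.Pipeline() as p:
        side = (p
                | 'Side' >> beam.Create([('k', 1), ('k', 2)])
                | 'Win' >> beam.WindowInto(
                    window.FixedWindows(10),
                    trigger=trigger.Repeatedly(trigger.AfterCount(1)),
                    accumulation_mode=trigger.AccumulationMode.DISCARDING)
                | 'GBK' >> beam.GroupByKey())
        main = p | 'Main' >> beam.Create(['a', 'b'])
        # The grouped, multi-triggered PCollection is consumed as a side input.
        _ = main | 'PairWithSide' >> beam.Map(
            lambda x, s: (x, list(s)), s=beam.pvalue.AsIter(side))

Per the traceback above, run_ParDo then fails while wiring this ParDo's main input: input_step arrives as a list of steps rather than a single step, so input_step.proto raises the AttributeError.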

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2246.732s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_34-1543096047610737141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_57_11-1460070238120667019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_08_52-8442063486193473941?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_17_26-859113861906087616?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_34-10946731893453423700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_57_11-4310038429517633031?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_05_48-5177282873702629057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_34-15069420912716664408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_57_16-17829440530031429921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_05_55-4053132369515432538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_35-1182736676214503076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_58_20-2706347290435828389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_07_04-17285756055592194254?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_34-6214623730199117888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_58_08-2530235908515080374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_08_51-2350610871358721141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_36-1957135447251980294?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_58_17-9765057848008992937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_07_59-1651598457209768149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_34-3456385365533752375?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_58_11-10561252484210823083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_49_32-4181567552453422264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_23_57_14-16252458624605301455?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-12_00_05_55-6551781404828583531?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 9s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/xseuy4ynqcsuk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #101

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/101/display/redirect?page=changes>

Changes:

[mxm] [BEAM-9474] Improve robustness of BundleFactory and ProcessEnvironment


------------------------------------------
[...truncated 5.51 MB...]
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T02:42:23.801741Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_19_42_22-11308770417710048200'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312024205-440862'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T02:42:23.801741Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_19_42_22-11308770417710048200]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_42_22-11308770417710048200?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_19_42_22-11308770417710048200 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:22.724Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_19_42_22-11308770417710048200. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:22.724Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:22.724Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_19_42_22-11308770417710048200.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:26.585Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:27.462Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.079Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.120Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.203Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.249Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.285Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.328Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.357Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.463Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.575Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.640Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.675Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.704Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.742Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.787Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.823Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.862Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.888Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.923Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.950Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:28.980Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.015Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.052Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.118Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.152Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.185Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.223Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.257Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.291Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.327Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.366Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.403Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.434Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.480Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.513Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.555Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:29.596Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
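
The fusion messages above spell out Reshuffle's expansion (AddRandomKeys, Map(reify_timestamps), a streaming GroupByKey, FlatMap(restore_timestamps), RemoveRandomKeys), which Create inserts as "MaybeReshuffle" to spread its elements across workers. The user-level equivalent is just beam.Reshuffle(); a minimal sketch:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(range(100))
             # Expands to the AddRandomKeys/GBK/RemoveRandomKeys chain
             # logged above, breaking fusion and redistributing elements.
             | beam.Reshuffle()
             | beam.Map(lambda x: x * x))
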
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:31.922Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:31.959Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:31.995Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:42:58.658Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:43:05.822Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:43:36.740Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:43:36.786Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:48:30.817Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:49:35.499Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:49:35.566Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:49:35.678Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:49:35.788Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:49:35.845Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:50:46.006Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:50:46.063Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T02:50:46.099Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_19_42_22-11308770417710048200 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------
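
This is the same AttributeError as in the previous build's log: at dataflow_runner.py line 957 the ParDo translator assumes a single producing step for its main input, but input_step arrives as a list. A toy reproduction of just that failure mode; Step and Proto here are stand-ins, not Beam's real classes:

    class Proto(object):
        def __init__(self, name):
            self.name = name

    class Step(object):
        def __init__(self, name):
            self.proto = Proto(name)

    # The translator expects one producing step but gets a list of
    # candidates (hypothetical step names):
    input_step = [Step('s15'), Step('s16')]
    try:
        input_step.proto.name
    except AttributeError as e:
        print(e)  # 'list' object has no attribute 'proto'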

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2153.606s

FAILED (errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_24-8709278780649439465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_02-7785788117615094528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_33_46-13640899963229034597?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_42_22-11308770417710048200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_24-16763255028053624494?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_05-6126263769286722238?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_32_50-11737335005070177224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_24-2519723304447254140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_22_55-3356314097218950389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_31_25-199299052403322415?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_24-11858129380000985663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_04-11121665161414886681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_33_41-15329110569899889824?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_22-16292738192414604579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_00-302088405998956970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_33_47-320809610773230502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_23-11344630752505138325?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_04-4973292926737717402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_32_34-1078016815081250059?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_23-14132782804785679111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_15-8065502259185370774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_32_57-998078885811379706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_15_25-9184603957406396786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_19_24_26-14517001206302641696?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 2s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ngq5lseoqwv6a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #100

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/100/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-9402] Remove options overwrite

[github] [BEAM-7815] update MemoryReporter comments about using guppy3 (#11073)


------------------------------------------
[...truncated 5.45 MB...]
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
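
The job graph above terminates in assert_that/Match, whose DoFn wraps the _equal predicate from Beam's testing utilities (visible in the display_data). A minimal sketch of that idiom, with illustrative values:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        out = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x + 1)
        # equal_to builds the _equal check applied by assert_that/Match.
        assert_that(out, equal_to([2, 3, 4]))
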
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-12T00:55:43.412674Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_17_55_42-16215057435310784178'
 location: u'us-central1'
 name: u'beamapp-jenkins-0312005513-931124'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-12T00:55:43.412674Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_17_55_42-16215057435310784178]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_55_42-16215057435310784178?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_17_55_42-16215057435310784178 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:42.266Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:42.266Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_17_55_42-16215057435310784178.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:42.266Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_17_55_42-16215057435310784178. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:45.777Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:46.381Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:46.916Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:46.951Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.008Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.032Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.057Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.093Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.126Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.299Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.398Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.676Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.734Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.798Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.847Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.886Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.930Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.960Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:47.990Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.013Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.046Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.071Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.094Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.125Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.151Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.179Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.208Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.232Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.260Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.285Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.309Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.333Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.362Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.386Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.420Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.447Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.470Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:48.497Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:52.770Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:52.800Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:55:52.829Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:56:16.143Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:56:25.433Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:56:49.073Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T00:56:49.101Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:01:51.704Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:02:55.345Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:02:55.409Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:02:55.442Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:02:55.477Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:02:55.516Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:03:55.968Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:03:56.026Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-12T01:03:56.071Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_17_55_42-16215057435310784178 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: Test a GBK sideinput, with multiple triggering.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/sideinputs_test.py",> line 406, in test_multi_triggered_gbk_side_input
    p.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 495, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 508, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 536, in run_pipeline
    self.visit_transforms(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 224, in visit_transforms
    pipeline.visit(RunVisitor(self))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 545, in visit
    self._root_transform().visit(visitor, self, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1033, in visit
    part.visit(visitor, pipeline, visited)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 1036, in visit
    visitor.visit_transform(self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 219, in visit_transform
    self.runner.run_transform(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/runner.py",> line 246, in run_transform
    return m(transform_node, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 957, in run_ParDo
    PropertyNames.STEP_NAME: input_step.proto.name,
AttributeError: 'list' object has no attribute 'proto'
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.runner: ERROR: Error while visiting Main windowInto
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2163.051s

FAILED (errors=1)
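
The --region warning in the captured logging above can be silenced by setting the region explicitly; a minimal sketch using Beam's standard pipeline options (the project value is simply the one from this log):

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions(['--project=apache-beam-testing',
                               '--region=us-central1'])
    # GoogleCloudOptions exposes the region flag the warning refers to.
    print(options.view_as(GoogleCloudOptions).region)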
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_41-17546816029974206947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_36_43-12252370406024729934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_45_16-5674532668723543872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_38-5096561765808253577?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_38_13-11641105076148050308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_46_48-9425987496838605912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_55_42-16215057435310784178?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_38-15992892056683299611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_38_14-13670287531877688255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_46_47-2925413651783595049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_38-6623745424211322762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_37_35-4272590186009497829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_46_16-4671714876517820848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_40-1731996473143209885?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_37_23-16962108813382933582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_45_53-8034307397036121302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_40-16357855822625081602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_37_36-15384755716520457041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_46_11-12995728597158078612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_41-9899390138982033199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_37_28-1878474237786625759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_46_02-6196649768370986763?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_28_40-12871680364312007485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_17_38_15-4499394763067054792?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
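
To reproduce either failure locally with the extra output Gradle suggests, the reported task paths can be re-run from the Beam repository root, e.g. ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests --stacktrace --info (a standard Gradle invocation of the task named above; the exact command is not shown in this log).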

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 53s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/vslpbxrvn4tcm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #99

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/99/display/redirect?page=changes>

Changes:

[github] Update Python roadmap for 2.7 eol


------------------------------------------
[...truncated 5.44 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
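
The kind:windowed_value encodings in the job graph above describe a windowed-value coder wrapping a pair-like element coder (two FastPrimitivesCoders) plus a global-window coder. A minimal sketch composing the equivalent coder objects with the Beam Python SDK (a reconstruction from the JSON, not code taken from this build):

    from apache_beam.coders.coders import (
        FastPrimitivesCoder, GlobalWindowCoder, TupleCoder,
        WindowedValueCoder)

    # The "is_pair_like" component: a key/value pair of FastPrimitivesCoders.
    pair = TupleCoder((FastPrimitivesCoder(), FastPrimitivesCoder()))
    # The "is_wrapper" windowed_value coder: element coder plus window coder.
    windowed = WindowedValueCoder(pair, GlobalWindowCoder())
    print(windowed)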
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-11T23:06:59.245833Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_16_06_58-2762037998641033001'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311230643-889630'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T23:06:59.245833Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_16_06_58-2762037998641033001]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_16_06_58-2762037998641033001?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_16_06_58-2762037998641033001 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:06:58.083Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:06:58.083Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_16_06_58-2762037998641033001. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:06:58.083Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_16_06_58-2762037998641033001.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:03.485Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:04.325Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:04.958Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:04.984Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.051Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.088Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.120Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.162Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.193Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.283Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.380Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.440Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.473Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.510Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.536Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.612Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.678Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.725Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.776Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.805Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.839Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.876Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.913Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.946Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:05.981Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.018Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.076Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.119Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.177Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.207Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.287Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.350Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.402Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.461Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.562Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.611Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.654Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:06.690Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:12.298Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:12.339Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:12.369Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:32.756Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:07:42.878Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:08:18.459Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:08:18.494Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:13:11.249Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:14:15.938Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:14:15.998Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:14:16.033Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:14:16.067Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:14:16.092Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:15:17.446Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:15:17.498Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T23:15:17.522Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_16_06_58-2762037998641033001 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2122.241s

OK
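
The recurring JOB_MESSAGE_WARNING about the project already containing 100 Dataflow-created metric descriptors can be cleared by pruning stale custom descriptors through the Cloud Monitoring v3 API it links to. A minimal sketch, assuming the google-cloud-monitoring client library; the metric-type prefix below is an assumption about how these descriptors are named, so verify before deleting anything:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = 'projects/apache-beam-testing'

    for descriptor in client.list_metric_descriptors(name=project):
        # Assumed prefix for Dataflow-created custom metrics; check first.
        if descriptor.type.startswith('custom.googleapis.com/dataflow'):
            print('deleting', descriptor.type)
            client.delete_metric_descriptor(name=descriptor.name)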
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_30-1211370213363668049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_06-13962564986506731601?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_58_30-5999701434345746041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_16_06_58-2762037998641033001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_29-9974982908526175951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_09-4770682806631250165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_57_58-1677021617640981745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_29-7001233413257548712?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_48_20-18220875530280394109?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_57_15-2097140658265532828?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_30-6970268198064196499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_18-3273922088151424709?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_57_58-5846888359597783451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_29-3181349016400520741?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_00-2681170404176551898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_57_42-10553025702807201767?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_27-16302393845825355878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_48_53-3430586155157294642?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_29-4356236894997489397?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_05-13973450090694171758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_58_07-18360793244096797190?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_40_27-13046923992540630792?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_49_05-978166987830681216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_15_58_09-14855856281652561149?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 25s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/4c3yyghi6b3ze

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #98

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/98/display/redirect?page=changes>

Changes:

[suztomo] grpc-google-cloud-pubsub-v1 1.85.1


------------------------------------------
[...truncated 5.43 MB...]
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-11T21:37:59.767668Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_14_37_58-16734925333878362507'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311213741-757328'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T21:37:59.767668Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_14_37_58-16734925333878362507]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_37_58-16734925333878362507?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_14_37_58-16734925333878362507 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:37:58.685Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_14_37_58-16734925333878362507. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:37:58.685Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:37:58.686Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_14_37_58-16734925333878362507.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:02.228Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.099Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.780Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.815Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.884Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.924Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:03.955Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.040Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.152Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.360Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.494Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.540Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.570Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.605Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.636Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.668Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.705Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.739Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.773Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.810Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.849Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.885Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.919Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.955Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:04.989Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.030Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.063Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.091Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.132Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.164Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.190Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.215Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.250Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.299Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.334Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.371Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:05.406Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:09.687Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:09.730Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:09.756Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:24.110Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:38:39.756Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:39:17.355Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:39:17.395Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:44:08.832Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:45:13.373Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:45:13.447Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:45:13.483Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:45:13.516Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:45:13.539Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:46:05.376Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:46:05.420Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T21:46:05.461Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_14_37_58-16734925333878362507 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2170.713s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_28-6705373619091769417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_20_02-11683173208731156559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_29_18-12621700821050789836?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_37_58-16734925333878362507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_28-3780664394819431595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_18_36-15148634308423158978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_27_57-13584129187597920577?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_28-3783948807270626698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_04-1971414107488480389?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_28_10-7413510452249197020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_28-18282638345582772393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_11-14275620824960112644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_27_02-7483497707330850803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_27-387767085752136276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_08-12696200748028328990?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_28_40-914474220853050604?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_29-6425004982256377706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_58-14704869348957181966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_30-1584811051024333413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_15-7353594158478345032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_28_48-14943248718582575393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_10_27-6978442530782171853?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_19_04-14130216979005757794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_14_28_19-14952444444034759732?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 22s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jq67y4ompdsbc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #97

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/97/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9295] Add Flink 1.10 build target and Make FlinkRunner compatible


------------------------------------------
[...truncated 5.44 MB...]
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-11T20:00:57.699942Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_13_00_56-7797717997438661157'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311200038-387736'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T20:00:57.699942Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_13_00_56-7797717997438661157]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_13_00_56-7797717997438661157?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_13_00_56-7797717997438661157 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:00:56.708Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_13_00_56-7797717997438661157. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:00:56.708Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_13_00_56-7797717997438661157.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:00:56.708Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:01.269Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.094Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.734Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.768Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.846Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.886Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.934Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:02.972Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.004Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.130Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.239Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.311Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.342Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.400Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.437Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.473Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.511Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.549Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.612Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.650Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.686Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.729Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.775Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.808Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.842Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.878Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.915Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.955Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:03.989Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.029Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.070Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.107Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.141Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.183Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
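
The run of "Fusing consumer ... into ..." messages above is the optimizer collapsing adjacent element-wise steps into single execution stages, so the intermediate PCollections between them are never materialized. A minimal sketch of the kind of adjacent pair that gets fused (illustrative transforms, not this job's graph):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([b'a', b'b'])
         | 'Decode' >> beam.Map(lambda b: b.decode('utf-8'))  # producer
         | 'ToVoidKey' >> beam.Map(lambda x: (None, x)))      # fused consumer
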
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.235Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.269Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.304Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:04.340Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:06.681Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:06.719Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:06.757Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:31.558Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
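
The quota warning above counts Stackdriver metric descriptors: on Dataflow, each unique user-defined metric name, independent of the DoFn that declares it, becomes one descriptor. A minimal sketch of how such a metric is declared in the Python SDK (the DoFn and metric name here are illustrative):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class CountElements(beam.DoFn):
        def __init__(self):
            # Each unique (namespace, name) pair yields one metric
            # descriptor on Dataflow, which is what the quota counts.
            self.elements = Metrics.counter(self.__class__, 'elements_seen')

        def process(self, element):
            self.elements.inc()
            yield element
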
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:01:37.076Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:02:17.612Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:02:17.643Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:07:05.650Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:08.260Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:08.324Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:08.356Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:08.408Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:08.442Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:59.942Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:08:59.999Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T20:09:00.039Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_13_00_56-7797717997438661157 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2119.033s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_10-5586420408327367536?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_43-17705868792791147880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_51_23-6665272133030606476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_12-7603866554099499557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_41_26-14069010004457858250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_50_01-5582300207446640627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_10-10073256018954232851?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_51-9538453226822813964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_51_32-2182513935666927406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_13-4109150631490895008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_47-6849818644688764624?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_52_17-9541912159551480454?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_13_00_56-7797717997438661157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_13-5045556197498904223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_43_47-15959996429270735007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_13-13326327056375593379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_58-3341350499424803619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_52_07-10690880260577774316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_11-4898368324677826021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_44-1952485852678528052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_51_14-13999777356154411969?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_34_14-6610327414334243556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_42_46-18239767880356452636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_12_51_27-15695691945878272582?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 23s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/x6pg6b7myogcw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #96

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/96/display/redirect?page=changes>

Changes:

[chadrik] Add pre-commit hook for pylint


------------------------------------------
[...truncated 5.51 MB...]
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Unkey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s19"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Unkey_29", 
        "user_name": "assert_that/Unkey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s21", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "assert_that/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s20"
        }, 
        "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: u'2020-03-11T18:31:17.030940Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-11_11_31_16-5913616157612663394'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311183059-166118'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T18:31:17.030940Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-03-11_11_31_16-5913616157612663394]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_31_16-5913616157612663394?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_11_31_16-5913616157612663394 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:16.036Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-11_11_31_16-5913616157612663394. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:16.036Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:16.036Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-11_11_31_16-5913616157612663394.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:22.493Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:23.557Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.110Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.143Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.193Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.225Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.249Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.279Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.306Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.373Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.443Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.495Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.518Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.540Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.562Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.585Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2643>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.612Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2643>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.639Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.660Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.684Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2643>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.728Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.758Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.807Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.833Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.862Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.890Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.918Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.944Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.973Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:24.999Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.032Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.056Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.082Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.130Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.161Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.190Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:25.217Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:33.846Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:33.869Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:33.902Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:41.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:41.395Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:41.429Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_11_22_39-14303522427772930798 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:56.444Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:31:58.680Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:32:35.903Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:32:35.939Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:37:26.494Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:38:35.353Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:38:35.411Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:38:35.435Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:38:35.481Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:38:35.513Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:39:18.774Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:39:18.818Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-03-11T18:39:18.860Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-03-11_11_31_16-5913616157612663394 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2134.198s

OK
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_18-12109934299552095895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_12_50-4821968449324166481?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_22_45-14799792487712602903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_31_16-5913616157612663394?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_17-14102520787414287539?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_14_02-14349309384888261269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_18-12445804788715346198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_11_51-8954254514363726746?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_20_23-8400216464815589172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_19-11931292830335117894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_12_48-3343065305648528569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_21_16-124344968612740789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_18-5746752110426001235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_12_47-17290340523550310608?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_22_18-1629985618951031813?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_19-2123297970058676960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_13_10-10228227769112935228?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_21_51-4137994393772316856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_19-1176066673575143575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_12_50-14506936253258406743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_22_39-14303522427772930798?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_04_17-2064325347767531991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_12_47-17719107762445576790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_11_21_17-4416410137020420670?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 46s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/paiye5ervufks

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #95

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/95/display/redirect?page=changes>

Changes:

[12602502+Ardagan] [BEAM-9431] Remove ReadFromPubSub/Read-out0-ElementCount from the


------------------------------------------
[...truncated 49.25 KB...]
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$1(SkipUpToDateStep.java:90)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:90)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)

> Task :sdks:python:test-suites:dataflow:py2:setupVirtualenv FAILED

> Task :runners:java-fn-execution:compileJava
error: error reading <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/java/harness/build/libs/beam-sdks-java-harness-2.21.0-SNAPSHOT.jar>; error in opening zip file
An exception has occurred in the compiler ((version info not available)). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program and the following diagnostic in your report. Thank you.
java.util.zip.ZipError: zip END header not found
	at com.sun.nio.zipfs.ZipFileSystem.zerror(ZipFileSystem.java:1605)
	at com.sun.nio.zipfs.ZipFileSystem.findEND(ZipFileSystem.java:1021)
	at com.sun.nio.zipfs.ZipFileSystem.initCEN(ZipFileSystem.java:1030)
	at com.sun.nio.zipfs.ZipFileSystem.<init>(ZipFileSystem.java:130)
	at com.sun.nio.zipfs.ZipFileSystemProvider.newFileSystem(ZipFileSystemProvider.java:139)
	at com.sun.tools.javac.file.JavacFileManager$ArchiveContainer.<init>(JavacFileManager.java:517)
	at com.sun.tools.javac.file.JavacFileManager.getContainer(JavacFileManager.java:319)
	at com.sun.tools.javac.file.JavacFileManager.list(JavacFileManager.java:715)
	at com.sun.tools.javac.code.ClassFinder.list(ClassFinder.java:726)
	at com.sun.tools.javac.code.ClassFinder.scanUserPaths(ClassFinder.java:659)
	at com.sun.tools.javac.code.ClassFinder.fillIn(ClassFinder.java:526)
	at com.sun.tools.javac.code.ClassFinder.complete(ClassFinder.java:293)
	at com.sun.tools.javac.code.Symbol.complete(Symbol.java:633)
	at com.sun.tools.javac.code.Symbol$PackageSymbol.members(Symbol.java:1120)
	at com.sun.tools.javac.code.Symtab.listPackageModules(Symtab.java:810)
	at com.sun.tools.javac.comp.Enter.visitTopLevel(Enter.java:344)
	at com.sun.tools.javac.tree.JCTree$JCCompilationUnit.accept(JCTree.java:529)
	at com.sun.tools.javac.comp.Enter.classEnter(Enter.java:285)
	at com.sun.tools.javac.comp.Enter.classEnter(Enter.java:300)
	at com.sun.tools.javac.comp.Enter.complete(Enter.java:570)
	at com.sun.tools.javac.comp.Enter.main(Enter.java:554)
	at com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:1052)
	at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:923)
	at com.sun.tools.javac.api.JavacTaskImpl.lambda$doCall$0(JavacTaskImpl.java:100)
	at com.sun.tools.javac.api.JavacTaskImpl.handleExceptions(JavacTaskImpl.java:142)
	at com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:96)
	at com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:90)
	at com.google.errorprone.BaseErrorProneCompiler.run(BaseErrorProneCompiler.java:137)
	at com.google.errorprone.BaseErrorProneCompiler.run(BaseErrorProneCompiler.java:108)
	at com.google.errorprone.ErrorProneCompiler.run(ErrorProneCompiler.java:118)
	at com.google.errorprone.ErrorProneCompiler.compile(ErrorProneCompiler.java:65)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:66)
	at net.ltgt.gradle.errorprone.ErrorProneCompiler.execute(ErrorProneCompiler.java:23)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.delegateAndHandleErrors(NormalizingJavaCompiler.java:100)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:52)
	at org.gradle.api.internal.tasks.compile.NormalizingJavaCompiler.execute(NormalizingJavaCompiler.java:38)
	at org.gradle.api.internal.tasks.compile.CleaningJavaCompilerSupport.execute(CleaningJavaCompilerSupport.java:39)
	at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:110)
	at org.gradle.api.internal.tasks.compile.incremental.IncrementalCompilerFactory$2.execute(IncrementalCompilerFactory.java:106)
	at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:59)
	at org.gradle.api.internal.tasks.compile.incremental.IncrementalResultStoringCompiler.execute(IncrementalResultStoringCompiler.java:43)
	at org.gradle.api.internal.tasks.compile.CompileJavaBuildOperationReportingCompiler$2.call(CompileJavaBuildOperationReportingCompiler.java:59)
	at org.gradle.api.internal.tasks.compile.CompileJavaBuildOperationReportingCompiler$2.call(CompileJavaBuildOperationReportingCompiler.java:51)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.compile.CompileJavaBuildOperationReportingCompiler.execute(CompileJavaBuildOperationReportingCompiler.java:51)
	at org.gradle.api.tasks.compile.JavaCompile.performCompilation(JavaCompile.java:154)
	at org.gradle.api.tasks.compile.JavaCompile.compile(JavaCompile.java:122)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.IncrementalTaskAction.doExecute(IncrementalTaskAction.java:73)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$1(SkipUpToDateStep.java:90)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:90)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)

> Task :runners:java-fn-execution:compileJava FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:java-fn-execution:compileJava'.
> Compilation failed with exit code 1; see the compiler error output for details.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 38s
57 actionable tasks: 41 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/mrmpaao76avle

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #94

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/94/display/redirect>

Changes:


------------------------------------------
[...truncated 65.34 KB...]
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.visitFile(ShadowCopyAction.groovy:248)
	at sun.reflect.GeneratedMethodAccessor405.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$BaseStreamAction.processFile(ShadowCopyAction.groovy:183)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1$1.processFile(NormalizingCopyActionDecorator.java:66)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1$1.processFile(DuplicateHandlingCopyActionDecorator.java:60)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.processFile(CopyFileVisitorImpl.java:62)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.visitFile(CopyFileVisitorImpl.java:46)
	at org.gradle.api.internal.file.collections.AbstractSingletonFileTree.visit(AbstractSingletonFileTree.java:36)
	at org.gradle.api.internal.file.collections.FileTreeAdapter.visit(FileTreeAdapter.java:118)
	at org.gradle.api.internal.file.CompositeFileTree.visit(CompositeFileTree.java:93)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:39)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:24)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:693)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:695)
	at org.gradle.api.internal.file.copy.DefaultCopySpec.walk(DefaultCopySpec.java:499)
	at org.gradle.api.internal.file.copy.CopySpecBackedCopyActionProcessingStream.process(CopySpecBackedCopyActionProcessingStream.java:38)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1.process(DuplicateHandlingCopyActionDecorator.java:44)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1.process(NormalizingCopyActionDecorator.java:57)
	at org.gradle.api.internal.file.copy.CopyActionProcessingStream$process.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2.execute(ShadowCopyAction.groovy:110)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2$execute.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.withResource(ShadowCopyAction.groovy:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite$StaticMetaMethodSiteNoUnwrapNoCoerce.invoke(StaticMetaMethodSite.java:151)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite.callStatic(StaticMetaMethodSite.java:102)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:216)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.execute(ShadowCopyAction.groovy:107)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator.execute(NormalizingCopyActionDecorator.java:53)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator.execute(DuplicateHandlingCopyActionDecorator.java:42)
	at org.gradle.api.internal.file.copy.CopyActionExecuter.execute(CopyActionExecuter.java:40)
	at org.gradle.api.tasks.AbstractCopyTask.copy(AbstractCopyTask.java:179)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar.copy(ShadowJar.java:96)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$1(SkipUpToDateStep.java:90)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:90)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor.process(DefaultPlanExecutor.java:74)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.executeWithServices(DefaultTaskExecutionGraph.java:178)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph.execute(DefaultTaskExecutionGraph.java:154)
	at org.gradle.execution.SelectedTaskExecutionAction.execute(SelectedTaskExecutionAction.java:41)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:40)
	at org.gradle.execution.DefaultBuildExecuter.access$000(DefaultBuildExecuter.java:24)
	at org.gradle.execution.DefaultBuildExecuter$1.proceed(DefaultBuildExecuter.java:46)
	at org.gradle.execution.DryRunBuildExecutionAction.execute(DryRunBuildExecutionAction.java:49)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:40)
	at org.gradle.execution.DefaultBuildExecuter.execute(DefaultBuildExecuter.java:33)
	at org.gradle.initialization.DefaultGradleLauncher$ExecuteTasks.run(DefaultGradleLauncher.java:383)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.initialization.DefaultGradleLauncher.runTasks(DefaultGradleLauncher.java:247)
	at org.gradle.initialization.DefaultGradleLauncher.doBuildStages(DefaultGradleLauncher.java:159)
	at org.gradle.initialization.DefaultGradleLauncher.executeTasks(DefaultGradleLauncher.java:134)
	at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:58)
	at org.gradle.internal.invocation.GradleBuildController$1.execute(GradleBuildController.java:55)
	at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:82)
	at org.gradle.internal.invocation.GradleBuildController$3.create(GradleBuildController.java:75)
	at org.gradle.internal.work.DefaultWorkerLeaseService.withLocks(DefaultWorkerLeaseService.java:183)
	at org.gradle.internal.work.StopShieldingWorkerLeaseService.withLocks(StopShieldingWorkerLeaseService.java:40)
	at org.gradle.internal.invocation.GradleBuildController.doBuild(GradleBuildController.java:75)
	at org.gradle.internal.invocation.GradleBuildController.run(GradleBuildController.java:55)
	at org.gradle.tooling.internal.provider.ExecuteBuildActionRunner.run(ExecuteBuildActionRunner.java:31)
	at org.gradle.launcher.exec.ChainingBuildActionRunner.run(ChainingBuildActionRunner.java:35)
	at org.gradle.launcher.exec.BuildOutcomeReportingBuildActionRunner.run(BuildOutcomeReportingBuildActionRunner.java:58)
	at org.gradle.tooling.internal.provider.ValidatingBuildActionRunner.run(ValidatingBuildActionRunner.java:32)
	at org.gradle.launcher.exec.BuildCompletionNotifyingBuildActionRunner.run(BuildCompletionNotifyingBuildActionRunner.java:39)
	at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:49)
	at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner$3.call(RunAsBuildOperationBuildActionRunner.java:44)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.launcher.exec.RunAsBuildOperationBuildActionRunner.run(RunAsBuildOperationBuildActionRunner.java:44)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:49)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter$1.transform(InProcessBuildActionExecuter.java:46)
	at org.gradle.composite.internal.DefaultRootBuildState.run(DefaultRootBuildState.java:78)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:46)
	at org.gradle.launcher.exec.InProcessBuildActionExecuter.execute(InProcessBuildActionExecuter.java:31)
	at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:42)
	at org.gradle.launcher.exec.BuildTreeScopeBuildActionExecuter.execute(BuildTreeScopeBuildActionExecuter.java:28)
	at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:78)
	at org.gradle.tooling.internal.provider.ContinuousBuildActionExecuter.execute(ContinuousBuildActionExecuter.java:52)
	at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:59)
	at org.gradle.tooling.internal.provider.SubscribableBuildActionExecuter.execute(SubscribableBuildActionExecuter.java:36)
	at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:68)
	at org.gradle.tooling.internal.provider.SessionScopeBuildActionExecuter.execute(SessionScopeBuildActionExecuter.java:38)
	at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:37)
	at org.gradle.tooling.internal.provider.GradleThreadBuildActionExecuter.execute(GradleThreadBuildActionExecuter.java:26)
	at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:43)
	at org.gradle.tooling.internal.provider.ParallelismConfigurationBuildActionExecuter.execute(ParallelismConfigurationBuildActionExecuter.java:29)
	at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:60)
	at org.gradle.tooling.internal.provider.StartParamsValidatingActionExecuter.execute(StartParamsValidatingActionExecuter.java:32)
	at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:55)
	at org.gradle.tooling.internal.provider.SessionFailureReportingActionExecuter.execute(SessionFailureReportingActionExecuter.java:41)
	at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:48)
	at org.gradle.tooling.internal.provider.SetupLoggingActionExecuter.execute(SetupLoggingActionExecuter.java:32)
	at org.gradle.launcher.daemon.server.exec.ExecuteBuild.doBuild(ExecuteBuild.java:67)
	at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.WatchForDisconnection.execute(WatchForDisconnection.java:37)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.ResetDeprecationLogger.execute(ResetDeprecationLogger.java:26)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.RequestStopIfSingleUsedDaemon.execute(RequestStopIfSingleUsedDaemon.java:34)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.call(ForwardClientInput.java:74)
	at org.gradle.launcher.daemon.server.exec.ForwardClientInput$2.call(ForwardClientInput.java:72)
	at org.gradle.util.Swapper.swap(Swapper.java:38)
	at org.gradle.launcher.daemon.server.exec.ForwardClientInput.execute(ForwardClientInput.java:72)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.LogAndCheckHealth.execute(LogAndCheckHealth.java:55)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.LogToClient.doBuild(LogToClient.java:62)
	at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.EstablishBuildEnvironment.doBuild(EstablishBuildEnvironment.java:81)
	at org.gradle.launcher.daemon.server.exec.BuildCommandOnly.execute(BuildCommandOnly.java:36)
	at org.gradle.launcher.daemon.server.api.DaemonCommandExecution.proceed(DaemonCommandExecution.java:104)
	at org.gradle.launcher.daemon.server.exec.StartBuildOrRespondWithBusy$1.run(StartBuildOrRespondWithBusy.java:50)
	at org.gradle.launcher.daemon.server.DaemonStateCoordinator$1.run(DaemonStateCoordinator.java:295)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 3s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/qsrqacdoi3ixm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #93

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/93/display/redirect>

Changes:


------------------------------------------
[...truncated 5.55 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2cf57d51-a048-4687-8306-bc816a0b4860"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2cf57d51-a048-4687-8306-bc816a0b4860", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2cf57d51-a048-4687-8306-bc816a0b4860", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
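
For readers reconstructing the failing case from the job graph above: steps s1, s2 and s3 encode a three-stage streaming pipeline (Pub/Sub read, metrics-generating ParDo, Pub/Sub write). A minimal Python sketch of that shape follows. It is not the exact test source; the transform labels are lifted from the user_name fields in the graph, and the <project>/<input-subscription>/<output-topic> placeholders stand in for the generated resource names printed above.

# Sketch of the pipeline shape encoded by steps s1/s2/s3 above. The DoFn's
# module path is the one reported in the graph; it ships with Beam's test
# sources, not the regular SDK surface.
import apache_beam as beam
from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
    StreamingUserMetricsDoFn)

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # "type": "JOB_TYPE_STREAMING"

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> ReadFromPubSub(
         subscription='projects/<project>/subscriptions/<input-subscription>',
         with_attributes=False)  # matches "with_attributes": false in s1
     | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())  # step s2
     | 'dump_to_pub' >> WriteToPubSub(
         topic='projects/<project>/topics/<output-topic>'))  # s3 NativeWrite
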
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-11T06:46:21.550668Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_23_46_20-2930105613095516448'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311064559-808493'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T06:46:21.550668Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_23_46_20-2930105613095516448]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_20-2930105613095516448?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_23_46_20-2930105613095516448 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:20.573Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_23_46_20-2930105613095516448. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:20.573Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:20.573Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_23_46_20-2930105613095516448.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:23.568Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:27.386Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.084Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.403Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.577Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.668Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.729Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.803Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:28.858Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.065Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.113Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.139Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.191Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.231Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.258Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:29.280Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:31.689Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:31.789Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:31.843Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:50.513Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:46:55.522Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:47:31.062Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T06:47:31.102Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_23_46_20-2930105613095516448 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
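
The google.auth / urllib3 DEBUG lines at the tail of the captured log are the test harness resolving Application Default Credentials from the GCE metadata server (169.254.169.254 / metadata.google.internal) and fetching a token for the default compute service account. A minimal sketch of the equivalent call, assuming it runs on a GCE worker with a default service account:

import google.auth
import google.auth.transport.requests

# google.auth.default() probes the metadata server, which produces the
# "GET http://metadata.google.internal/..." DEBUG lines seen above.
credentials, project_id = google.auth.default()
# Refreshing performs the .../service-accounts/.../token request.
credentials.refresh(google.auth.transport.requests.Request())
print(project_id, bool(credentials.token))
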

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2301.524s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_20-653244888607751791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_55_42-7072372997108744723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_05_32-3872878315275776623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_15_36-17417928316221147357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_20-8161987173511063587?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_55_39-6829195721151448356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_04_35-8557083821282384483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_20-2930105613095516448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_54_35-515864849536081738?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_03_39-3557407317158213682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_22-6723745013299273971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_56_10-2288101927703439207?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_05_20-5286674956637785164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_22-3302899930967398429?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_54_42-16879001192272881705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_03_11-8534086246161143132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_19-4341939464080592793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_55_36-16712156770722682461?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_04_34-8113416113842678237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_22-14472719127165775960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_56_18-5387627293502040564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-11_00_04_44-6581323231224847307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_46_24-850421407842126592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_23_56_09-4370821474270015367?project=apache-beam-testing
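
Regarding the JOB_MESSAGE_WARNING about metric descriptors earlier in the captured log: each unique user-defined metric name creates one Stackdriver metric descriptor, independent of the DoFn that defines it, and this project has hit the 100-descriptor cap. For context, user metrics in the Python SDK are declared along these lines (a minimal sketch; the names are illustrative, not the actual StreamingUserMetricsDoFn source):

import apache_beam as beam
from apache_beam.metrics import Metrics

class ExampleMetricsDoFn(beam.DoFn):
    def __init__(self):
        # Each unique (namespace, name) pair becomes one metric descriptor
        # in Stackdriver, per the warning above.
        self.element_count = Metrics.counter('example_ns', 'element_count')
        self.element_bytes = Metrics.distribution('example_ns', 'element_bytes')

    def process(self, element):
        self.element_count.inc()
        self.element_bytes.update(len(element))
        yield element
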

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 7s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/yabu3khleneda

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #92

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/92/display/redirect>

Changes:


------------------------------------------
[...truncated 5.49 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input941f4d42-67ee-4640-abc3-1a188d248b74"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input941f4d42-67ee-4640-abc3-1a188d248b74", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output941f4d42-67ee-4640-abc3-1a188d248b74", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-11T01:45:38.662980Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_18_45_37-16082134647934128089'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311014506-771980'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T01:45:38.662980Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_18_45_37-16082134647934128089]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_37-16082134647934128089?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_18_45_37-16082134647934128089 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:37.286Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_18_45_37-16082134647934128089. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:37.286Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:37.287Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_18_45_37-16082134647934128089.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:41.343Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:42.204Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:42.829Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:42.868Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:42.939Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:42.980Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.007Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.044Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.101Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.160Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.198Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.231Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.276Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.303Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.330Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:43.365Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:45.654Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:45.688Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:45:45.727Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:46:10.878Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:46:15.998Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:46:44.025Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T01:46:44.054Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_18_45_37-16082134647934128089 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
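
On the autoscaling JOB_MESSAGE_WARNING earlier in this log ("Workers will scale between 1 and 100 unless maxNumWorkers is specified"): if a job wants to pin that range, the standard Beam worker options can be passed explicitly. A hedged sketch with the usual Dataflow flags (the values are illustrative, not what this suite uses):

from apache_beam.options.pipeline_options import PipelineOptions

# --max_num_workers caps autoscaling instead of the default upper bound
# of 100 mentioned in the warning; --num_workers sets the starting size.
options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=<project>',
    '--region=us-central1',
    '--streaming',
    '--num_workers=1',
    '--max_num_workers=10',
])
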

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2159.421s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_54-10517967597666290689?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_55_12-15876987270535584243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_03_46-12741910120816766413?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_12_35-7443978005006036234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_44-13438457583875098493?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_22-11448807186390636462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_02_48-37387324574108489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_37-16082134647934128089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_53_48-1796945420716880766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_02_52-6409523449932691372?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_49-14647527049222052920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_28-13081146749743352944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_03_16-17078775784736853286?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_52-13949035955257334078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_25-13089356175108907545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_03_00-9538263174634307847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_36-11414289822855056875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_16-13195239635731140926?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_02_45-15136635593645937779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_49-8491001385963303877?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_34-4651275276844364775?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_19_03_17-6538318956421538391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_45_45-6011917263882894633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_18_54_30-15727987685804698313?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 41s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/256tybnw2lhve

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #91

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/91/display/redirect?page=changes>

Changes:

[github] [BEAM-9481] Exclude signature files from expansion service test

[github] Install typing package only for Python < 3.5.3 (#10821)


------------------------------------------
[...truncated 5.46 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input9e99eeb4-4890-4b55-9780-e3e8e836fa90"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input9e99eeb4-4890-4b55-9780-e3e8e836fa90", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output9e99eeb4-4890-4b55-9780-e3e8e836fa90", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
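The repeated "kind:windowed_value" encoding entries in the job description above describe a composite coder: each element is a windowed value whose payload coder is kind:bytes and whose window coder is kind:global_window. A minimal sketch of the equivalent Python SDK coder, assuming only the apache_beam package; the b'payload' value is illustrative:

    from apache_beam.coders.coders import (BytesCoder, GlobalWindowCoder,
                                           WindowedValueCoder)
    from apache_beam.transforms.window import GlobalWindow
    from apache_beam.utils.windowed_value import WindowedValue

    # Payload coder kind:bytes, window coder kind:global_window.
    coder = WindowedValueCoder(BytesCoder(), GlobalWindowCoder())
    wv = WindowedValue(b'payload', 0, (GlobalWindow(),))
    assert coder.decode(coder.encode(wv)).value == b'payload'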
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-11T00:04:25.928163Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_17_04_24-13317423559333771843'
 location: u'us-central1'
 name: u'beamapp-jenkins-0311000405-637442'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-11T00:04:25.928163Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_17_04_24-13317423559333771843]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_24-13317423559333771843?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_17_04_24-13317423559333771843 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:24.633Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:24.633Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_17_04_24-13317423559333771843.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:24.633Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_17_04_24-13317423559333771843. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:29.648Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:30.408Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:30.927Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:30.959Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.030Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.069Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.099Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.135Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.170Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.231Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.265Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.292Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.335Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.365Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.402Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:04:31.443Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:05:00.805Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:05:06.065Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:05:06.101Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:05:06.139Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-11T00:05:32.775Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_17_04_24-13317423559333771843 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
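Steps s1 through s3 in the job description above amount to a three-stage streaming pipeline: a Pub/Sub read, a metrics-emitting ParDo named generate_metrics, and a Pub/Sub write named dump_to_pub. A hedged reconstruction in the Beam Python SDK follows; the DoFn body and the counter name 'total_values' are guesses at what StreamingUserMetricsDoFn does rather than the actual test code, and the <project>/<input>/<output> placeholders stand in for the test's generated resources:

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Counts elements; mirrors the DoFn named in the graph above."""
        def __init__(self):
            self.total_values = Metrics.counter(self.__class__, 'total_values')

        def process(self, element):
            self.total_values.inc()
            yield element

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription='projects/<project>/subscriptions/<input>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
               topic='projects/<project>/topics/<output>'))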

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2139.066s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_24-6144417055377107701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_14_05-9794551108871304880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_22_30-4821645424109653405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_31_20-13089953374452165629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_24-13317423559333771843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_12_32-1946821269558138157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_04-16513471114094758709?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_25-15256922910332784794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_12_59-17802511031581617021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_28-517850606387638318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_23-1790261412216013862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_12_58-8691924124128495842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_23-2728358784663255390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_13_06-10643190023939720173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_41-2796861629490888911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_21-13801305348959811719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_12_58-8733724579949034207?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_27-14808391854517998942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_24-14816701083854825014?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_13_12-2529657778465220713?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_46-14177232932084035460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_04_22-7386988258168806183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_12_57-11662960374419696457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_17_21_36-1531086352335996147?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 54s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/b2resvvqrkrtm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #90

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/90/display/redirect?page=changes>

Changes:

[pabloem] Updating BigQuery client APIs


------------------------------------------
[...truncated 5.50 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc09d58c8-c649-4c4b-874d-09efea5d8c20"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc09d58c8-c649-4c4b-874d-09efea5d8c20", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputc09d58c8-c649-4c4b-874d-09efea5d8c20", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T22:33:40.506727Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_15_33_39-788612364635925832'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310223324-267941'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T22:33:40.506727Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_15_33_39-788612364635925832]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_39-788612364635925832?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_15_33_39-788612364635925832 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:39.339Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_15_33_39-788612364635925832.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:39.339Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:39.339Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_15_33_39-788612364635925832. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:42.713Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:43.768Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.390Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.424Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.488Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.536Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.567Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.592Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.618Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.664Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.694Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.726Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.761Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.796Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.821Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:44.854Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:47.087Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:47.123Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:33:47.160Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:34:14.579Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:34:16.922Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:34:43.635Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T22:34:43.669Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_15_33_39-788612364635925832 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
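The google.auth and urllib3 DEBUG lines above show Application Default Credentials resolving against the GCE metadata server: first the project id, then the default service account, then an access token. A minimal sketch of the same flow, assuming the google-auth package and an environment (such as a GCE VM) where ADC can resolve:

    import google.auth
    from google.auth.transport.requests import Request

    # On GCE this queries the metadata server, matching the
    # GET .../project/project-id request logged above.
    credentials, project_id = google.auth.default()

    # Forces a token fetch, matching the GET .../token request above.
    credentials.refresh(Request())
    print(project_id, bool(credentials.token))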

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2084.335s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_40-1304277586207512708?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_42_40-15252167757831753978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_51_14-10007575617971582857?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_59_49-11817782451690164545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_37-2310583332362565128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_42_37-3473115917300926790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_51_05-12211556059053402958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_39-788612364635925832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_41_22-7133565170839949236?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_50_09-11977801977283616315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_38-13435016141009901935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_42_10-13607724730561520967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_50_54-7505496336527872825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_36-11321054640987592295?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_41_26-12361016701169939319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_50_07-15430026840846595121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_39-4037794631491638265?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_43_02-18540587249480111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_39-16515625426826071051?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_42_26-14427095625749682818?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_50_54-1013581750777196913?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_33_37-6046964520486139393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_42_37-17815969894829568040?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_15_51_02-17609167529881961956?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 45s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/4pnaznhnqok3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #89

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/89/display/redirect?page=changes>

Changes:

[rohde.samuel] [BEAM-8335] TestStreamService integration with DirectRunner


------------------------------------------
[...truncated 5.48 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input8a06ac84-4e6d-4107-bde4-98cfe8982894"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input8a06ac84-4e6d-4107-bde4-98cfe8982894", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output8a06ac84-4e6d-4107-bde4-98cfe8982894", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T20:01:29.142574Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_13_01_27-16448572980664057987'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310200114-696305'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T20:01:29.142574Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_13_01_27-16448572980664057987]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_27-16448572980664057987?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_13_01_27-16448572980664057987 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:27.935Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_13_01_27-16448572980664057987.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:27.935Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:27.935Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_13_01_27-16448572980664057987. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:31.511Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:32.873Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.443Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.479Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.559Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.597Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.627Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.658Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.690Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.751Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.789Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.825Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.873Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.911Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.936Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:33.975Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:36.246Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:36.286Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:01:36.322Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:02:00.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:02:08.322Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:02:28.501Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T20:02:28.540Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_13_01_27-16448572980664057987 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
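The "Timing out on waiting for job ... after 61 seconds" warning above reflects the pattern of bounding wait_until_finish on a streaming job and then inspecting its metrics. A minimal sketch of that pattern, assuming p is an apache_beam.Pipeline submitted to the DataflowRunner; the counter name 'total_values' is the same illustrative name used earlier, and duration is in milliseconds:

    from apache_beam.metrics.metric import MetricsFilter
    from apache_beam.runners.runner import PipelineState

    result = p.run()
    result.wait_until_finish(duration=60 * 1000)  # stop waiting after ~60s

    # Query user counters off the (possibly still-running) job.
    query = result.metrics().query(MetricsFilter().with_name('total_values'))
    for counter in query['counters']:
        print(counter.key.metric.name, counter.attempted)

    if not PipelineState.is_terminal(result.state):
        result.cancel()  # a streaming job keeps running otherwise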

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2075.449s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_28-1775295943675013801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_10_10-1348366938266360540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_54-11937028794755376224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_27_25-13243883362251401456?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_28-9244993176103357870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_09_54-17363506608182988406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_28-7819536601192051478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_27-16448572980664057987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_08_37-13720952423129308740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_17_21-16458462683231948814?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_28-8075579428612157703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_10_32-4257389038935829401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_27-18278429726647781545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_09_54-16999872237919033683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_39-5332354326732799332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_27-7936151666700044933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_10_13-10004922975209088205?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_45-12437089356811446948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_25-14309349074243694925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_09_52-9826975285319165260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_27-11087559598637377593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_01_26-16863425400634316674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_09_58-9526730091415262108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_13_18_22-9306480503285230090?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 14m 47s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jm2h3vb4eqtqm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #88

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/88/display/redirect?page=changes>

Changes:

[apilloud] [BEAM-9411] Enable BigQuery DIRECT_READ by default in SQL

[github] [BEAM-9478] Update samza runner page to reflect post 1.0 changes


------------------------------------------
[...truncated 5.48 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb9eeaf8d-b489-4758-b00f-aff4a12cdf65"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb9eeaf8d-b489-4758-b00f-aff4a12cdf65", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputb9eeaf8d-b489-4758-b00f-aff4a12cdf65", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
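The three steps in the job graph above (ParallelRead s1 -> ParallelDo s2 ->
ParallelWrite s3) correspond to a small streaming pipeline of roughly the
following shape. This is a minimal sketch, not the actual test code: the
subscription/topic values are placeholders, and GenerateMetricsDoFn merely
stands in for StreamingUserMetricsDoFn.

    import apache_beam as beam
    from apache_beam.io import ReadFromPubSub, WriteToPubSub
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions

    class GenerateMetricsDoFn(beam.DoFn):
        # Stand-in for StreamingUserMetricsDoFn: bump a user counter per element.
        def __init__(self):
            self.byte_counter = Metrics.counter(self.__class__, 'total_bytes')

        def process(self, element):
            self.byte_counter.inc(len(element))
            yield element

    def run(subscription, topic):
        with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
            (p
             | 'ReadFromPubSub' >> ReadFromPubSub(subscription=subscription)  # s1
             | 'generate_metrics' >> beam.ParDo(GenerateMetricsDoFn())        # s2
             | 'dump_to_pub' >> WriteToPubSub(topic))                         # s3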
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T18:40:19.231029Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_11_40_18-10360734027031000142'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310184002-056967'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T18:40:19.231029Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_11_40_18-10360734027031000142]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_18-10360734027031000142?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_11_40_18-10360734027031000142 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:18.167Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_11_40_18-10360734027031000142.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:18.167Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_11_40_18-10360734027031000142. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:18.167Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:21.189Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.210Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.803Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.828Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.887Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.925Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.954Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:22.977Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.001Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.058Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.126Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.163Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.226Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.256Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.277Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:23.303Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:25.578Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:25.607Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:25.631Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:40:57.090Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:41:01.469Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
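The warning above suggests pruning stale custom metric descriptors. A minimal
cleanup sketch with the google-cloud-monitoring client (the library choice and
the filter string are assumptions; the log itself only links the raw REST
methods):

    from google.cloud import monitoring_v3

    def delete_dataflow_metric_descriptors(project_id):
        client = monitoring_v3.MetricServiceClient()
        parent = 'projects/{}'.format(project_id)
        # List custom metric descriptors and delete each one; narrow the
        # filter before running this against a real project.
        descriptors = client.list_metric_descriptors(request={
            'name': parent,
            'filter': 'metric.type = starts_with("custom.googleapis.com/")',
        })
        for descriptor in descriptors:
            client.delete_metric_descriptor(request={'name': descriptor.name})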
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:41:24.400Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T18:41:24.429Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_11_40_18-10360734027031000142 after 61 seconds
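The "Timing out on waiting" warning is the expected result of waiting on a
streaming job with a bounded duration: the job is still in JOB_STATE_RUNNING
when the wait expires. A sketch of that polling pattern (the surrounding
harness details are assumptions; wait_until_finish takes milliseconds):

    # Launch the pipeline, then wait at most ~60s for a terminal state.
    # A streaming job still RUNNING afterwards yields the timeout warning.
    result = pipeline.run()
    result.wait_until_finish(duration=60 * 1000)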
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
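The google.auth / urllib3 DEBUG lines above are the GCE metadata-server round
trips made while resolving Application Default Credentials; a minimal sketch
of the call that triggers them (assuming the google-auth library):

    import google.auth

    # On a GCE VM this probes the metadata server, resolves the project id,
    # and later fetches a service-account access token, as logged above.
    credentials, project_id = google.auth.default()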
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2117.470s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_20-371784593596547982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_50_34-18381746345044687593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_58_24-13117172985117996682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_12_07_00-17096842744661503768?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_17-6119285855019517572?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_48_54-4681388474226148847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_58_21-16551430573500510371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_18-10360734027031000142?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_47_51-12169775049313837479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_56_35-8437387575892240416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_17-14463507260230615091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_49_55-7119797313165866889?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_18-3380295860531974676?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_48_54-6389007655099010923?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_57_25-6168082746304323169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_16-11031569362950026604?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_48_28-12498800335066827107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_57_03-8646401972752935491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_18-9260885405421208229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_49_09-3330688389778495773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_57_40-15500342741687815697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_40_16-11707147962768875667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_49_56-4979370958491350353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_11_58_25-16190959668751225622?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/hihesscquuvug

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #87

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/87/display/redirect?page=changes>

Changes:

[suztomo] grpc 1.27.2 and gax 1.54.0

[suztomo] bigquerystorage 0.125.0-beta


------------------------------------------
[...truncated 5.46 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input60409f52-f6a0-4367-93dc-a8cc57f40d7f"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input60409f52-f6a0-4367-93dc-a8cc57f40d7f", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output60409f52-f6a0-4367-93dc-a8cc57f40d7f", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T17:11:20.592929Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_10_11_19-2622182646573114736'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310171103-000885'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T17:11:20.592929Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_10_11_19-2622182646573114736]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_19-2622182646573114736?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_10_11_19-2622182646573114736 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:19.132Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_10_11_19-2622182646573114736.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:19.132Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:19.132Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_10_11_19-2622182646573114736. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:22.625Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:23.426Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:23.993Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.019Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.080Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.118Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.147Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.181Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.206Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.249Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.278Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.305Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.347Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.380Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.413Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:24.445Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:26.691Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:26.725Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:26.753Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:49.760Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:11:51.123Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:12:25.979Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T17:12:26.011Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_10_11_19-2622182646573114736 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2276.817s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_17-17347012062446600382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_10-10925177158141063066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_28_33-3677875822038942600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_19-2622182646573114736?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_18_54-6275621252897977090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_28_19-18220002326108116583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_20-7044620633246011960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_27-281941373513519033?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_30_19-8071138352580482606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_21-11665619693796211476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_28-7452952239559795243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_30_14-1505167363370129387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_39_48-13056615266577854855?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_16-16971027844631166958?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_19_33-2122131964423779924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_29_17-10125739179013559191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_20-11675813486303067039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_40-14226233760512772300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_19-5915918522653350093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_36-14831019693184737449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_30_05-16478261943022267282?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_11_17-5218394351978206443?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_20_31-513088612721111553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_10_30_02-1148693488899354207?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 0s
64 actionable tasks: 49 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/6btal6coa3ypo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #86

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/86/display/redirect?page=changes>

Changes:

[echauchot] Fix wrong generated code comment.


------------------------------------------
[...truncated 5.53 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input346321e4-1dd3-4a08-a7c9-78531341fb0a"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input346321e4-1dd3-4a08-a7c9-78531341fb0a", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output346321e4-1dd3-4a08-a7c9-78531341fb0a", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T15:44:42.553784Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_08_44_41-2089565878671367952'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310154421-694324'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T15:44:42.553784Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_08_44_41-2089565878671367952]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_41-2089565878671367952?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_08_44_41-2089565878671367952 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:41.390Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_08_44_41-2089565878671367952.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:41.390Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_08_44_41-2089565878671367952. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:41.390Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:44.559Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.060Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.652Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.683Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.736Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.769Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.807Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.831Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.857Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.910Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.939Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.965Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:46.995Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:47.025Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:47.046Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:47.075Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:49.340Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:49.361Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:44:49.394Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:45:07.330Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T15:45:17.733Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_08_44_41-2089565878671367952 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2152.756s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_38-12313996173848998929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_44-11545853630697911276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_50-11172641625338180569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_11_49-11683381167349460249?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_41-2089565878671367952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_52_06-12232494914432068118?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_03-2287070923244727452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_39-11100489348260787945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_41-1461765146746493511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_27-9755637183529251685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_39-4326854239421792175?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_29-5615790836060640273?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_25-11226305354494459529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_37-8381044146814379666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_30-9447602826971713350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_32-10354793180038348023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_39-17502385506932070830?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_31-2595165472040199226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_45-13366127158572701721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_40-10549249629241407874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_24-9600157672900894537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_09_02_32-6540194370316506544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_44_40-13478457191859192794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_08_53_37-1247682646475804330?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 29s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/bzdc4ykpyhrpe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #85

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/85/display/redirect>

Changes:


------------------------------------------
[...truncated 5.48 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input609f73ea-2768-4bac-bdf5-c0c167d607e4"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input609f73ea-2768-4bac-bdf5-c0c167d607e4", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output609f73ea-2768-4bac-bdf5-c0c167d607e4", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T12:44:34.046946Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_05_44_32-2499431303397003860'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310124417-814267'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T12:44:34.046946Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_05_44_32-2499431303397003860]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_32-2499431303397003860?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_05_44_32-2499431303397003860 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:32.956Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_05_44_32-2499431303397003860. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:32.956Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:32.956Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_05_44_32-2499431303397003860.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:36.308Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:36.993Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.645Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.687Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.758Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.789Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.824Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.864Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.900Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:37.971Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.019Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.051Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.082Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.111Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.150Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:38.183Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:41.430Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:41.470Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:41.511Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:44:47.120Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T12:45:05.731Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_05_44_32-2499431303397003860 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2164.848s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_32-16648565660662317411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_43-5978424716927541942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_03_09-1734987944934320450?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_11_13-1336351508060463555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_32-2499431303397003860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_52_22-4577898195321694083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_01_48-14202285760256967989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_34-6493752432396635043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_58-13116513966631717735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_03_03-17594606074063901223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_33-1506263844639826986?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_53-16736025572635947989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_03_10-13387293437284599194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_31-13075048348123554149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_54_37-12099646461501862555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_33-14828433841204327821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_49-16756286101395163159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_03_05-2791934191255122478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_34-17505474559462041683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_49-14973854797043417179?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_02_54-1110833877504243174?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_44_32-6698122646603360232?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_05_53_28-4548377143470484793?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_06_02_49-17652855477430461384?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 59s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/foeu55i6q7r5a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org

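The JOB_MESSAGE_WARNING above about 100 Dataflow-created metric descriptors recurs in every one of these runs: each unique user-defined metric name mints a new descriptor, and the apache-beam-testing project has hit the Stackdriver limit. A minimal cleanup sketch using the google-cloud-monitoring client follows; the 'custom.googleapis.com/dataflow' type prefix is an assumption, so inspect descriptor.type before deleting anything.

# Sketch only: list and delete stale Dataflow custom metric descriptors.
# Assumes google-cloud-monitoring is installed; the filter prefix is an
# assumption -- verify descriptor.type before deleting for real.
from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
request = {
    "name": "projects/apache-beam-testing",
    "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
}
for descriptor in client.list_metric_descriptors(request=request):
    print("Deleting %s" % descriptor.type)
    # Each delete frees quota toward the 100-descriptor limit.
    client.delete_metric_descriptor(name=descriptor.name)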

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #84

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/84/display/redirect>

Changes:


------------------------------------------
[...truncated 5.48 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb81c4974-ec87-4f93-9e70-00e54753c7a5"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb81c4974-ec87-4f93-9e70-00e54753c7a5", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputb81c4974-ec87-4f93-9e70-00e54753c7a5", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T07:40:08.485208Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-10_00_40_07-13821160157730593504'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310073951-706780'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T07:40:08.485208Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-10_00_40_07-13821160157730593504]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_07-13821160157730593504?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-10_00_40_07-13821160157730593504 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:07.281Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:07.281Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-10_00_40_07-13821160157730593504. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:07.281Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-10_00_40_07-13821160157730593504.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:10.730Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:11.426Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.044Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.077Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.135Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.171Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.197Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.229Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.260Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.309Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.335Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.366Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.403Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.435Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.469Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:12.543Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:14.765Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:14.794Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:14.828Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:39.259Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T07:40:46.126Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-10_00_40_07-13821160157730593504 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2201.366s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_04-7005706628606916327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_48_51-17070407656594161777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_58_42-12653269387889119169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_01_08_03-1457365093856398044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_07-13821160157730593504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_48_12-13602666245559174679?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_56_53-11480347570842207067?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_19-12002433881070967412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_49_06-14721642197306050996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_57_57-15208118167605708839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_06-11174901781247110531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_49_11-2997228533348768669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_58_06-4668701902567405822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_04-1392383833263724062?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_49_01-400244648788771960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_57_47-10420711199537904055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_06-9626187480994514605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_48_57-16803012211696891898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_57_48-17669297182034589477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_06-16883382953142215154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_49_01-9061495747930966943?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_58_41-10855058870701715781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_40_05-13783708557824696871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-10_00_49_56-4735890578162685688?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 39s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/pkt5vbmfkr57y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


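The job graph serialized above (steps s1 through s3) is the small streaming pipeline this suite submits: a Pub/Sub read fused into StreamingUserMetricsDoFn, fused into a Pub/Sub write. A rough reconstruction in the Beam Python SDK, with the subscription and topic strings as placeholders rather than the test's per-run resources:

# Rough sketch of the pipeline shape shown in the job graph above; the
# subscription/topic values are placeholders, not the test's resources.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
    StreamingUserMetricsDoFn)

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    _ = (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-sub>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output-topic>'))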

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #83

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/83/display/redirect?page=changes>

Changes:

[github] Additional new Python Katas (#11078)


------------------------------------------
[...truncated 5.47 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute5bbe205-86f3-4ef4-9f0f-3a23cb520b99"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute5bbe205-86f3-4ef4-9f0f-3a23cb520b99", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outpute5bbe205-86f3-4ef4-9f0f-3a23cb520b99", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T06:22:13.743547Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_23_22_12-4870363560986972768'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310062158-754042'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T06:22:13.743547Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_23_22_12-4870363560986972768]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_12-4870363560986972768?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_23_22_12-4870363560986972768 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:12.734Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_23_22_12-4870363560986972768.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:12.734Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_23_22_12-4870363560986972768. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:12.734Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:16.192Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:18.282Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:18.888Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:18.919Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:18.975Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.081Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.114Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.153Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.181Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.241Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.280Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.309Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.349Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.379Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.418Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:19.455Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:21.695Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:21.719Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:21.756Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:45.289Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T06:22:54.276Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_23_22_12-4870363560986972768 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2130.508s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_14-17094300772872691030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_31_30-15360225236537387580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_40_50-12912988478316747927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_48_40-5118104390096665266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_12-15108699061277831126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_31_37-14025714258524294009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_39_36-8636377439670346240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_12-4870363560986972768?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_29_56-14680138429926187023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_37_59-2562970143810986262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_15-341989814371186996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_31_29-3431640032893810404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_39_53-11632576072721712235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_10-16190697559410500104?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_30_20-15461022544283599199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_39_09-9737060543921344942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_14-2598248326278765371?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_32_02-3353610046502857669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_13-17339630235410412532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_31_36-13653263017581089441?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_40_48-1288370334696553365?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_22_11-8929828757493731936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_31_01-3254712890302516138?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_23_40_21-5214068086577958271?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 29s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/lqcbaco3zxq7s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


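The "Timing out on waiting for job ... after 60 seconds" WARNING in each run is the test harness giving up its bounded wait, not the job crashing: a streaming job never reaches a terminal state on its own, so the caller waits with a duration and then tears the job down. A sketch of that wait-then-cancel pattern, assuming a pipeline built as in the earlier sketch but run explicitly instead of via the context manager:

# Sketch of the bounded wait that produces the timeout WARNING above;
# 'pipeline' is assumed to be a beam.Pipeline built as sketched earlier.
result = pipeline.run()
# duration is in milliseconds; wait_until_finish returns once the job
# reaches a terminal state or the duration elapses, whichever is first.
result.wait_until_finish(duration=60 * 1000)
result.cancel()  # streaming jobs must be torn down explicitly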

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #82

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/82/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2939, BEAM-9458] Use deduplication transform for UnboundedSources


------------------------------------------
[...truncated 5.50 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input03829b31-daff-4ca3-90db-f98b8fb3f953"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input03829b31-daff-4ca3-90db-f98b8fb3f953", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output03829b31-daff-4ca3-90db-f98b8fb3f953", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T04:05:36.910033Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_21_05_35-6747770022331408789'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310040518-340138'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T04:05:36.910033Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_21_05_35-6747770022331408789]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_35-6747770022331408789?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_21_05_35-6747770022331408789 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:35.941Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:35.941Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_21_05_35-6747770022331408789.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:35.941Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_21_05_35-6747770022331408789. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:39.171Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.168Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.751Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.778Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.835Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.875Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.903Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.932Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:40.953Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.007Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.035Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.070Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.110Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.137Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.171Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:41.203Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:43.492Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:43.536Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:05:43.566Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:06:13.741Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T04:06:19.159Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_21_05_35-6747770022331408789 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2209.277s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_34-3710571088807491428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_28-12475282034841757895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_24_42-11371783110000285362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_33_07-11037150683238563988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_34-18279881602972770886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_26-14088850209938295469?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_23_25-15943426044412332669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_35-6747770022331408789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_20-17923656169153224782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_23_25-4716018491784129127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_35-12441515242866640910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_15_39-1392798229517885509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_24_15-5844275232164476270?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_35-18196653390259981322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_27-15700767679329174648?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_23_11-15718022466351547177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_36-16235036648724610360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_32-15157797076458566498?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_36-2057422835269558460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_30-10194262212361296362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_23_30-16144852421668059367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_05_33-9395636216740615078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_14_15-11678713645155384508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_21_24_32-6019599981525666238?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 58s
64 actionable tasks: 55 executed, 9 from cache

Publishing build scan...
https://gradle.com/s/ypk7cfq5i6yrc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #81

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/81/display/redirect>

Changes:


------------------------------------------
[...truncated 5.52 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input1105bf45-dc93-416f-90a2-0a7a94b029ce"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input1105bf45-dc93-416f-90a2-0a7a94b029ce", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output1105bf45-dc93-416f-90a2-0a7a94b029ce", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
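
For orientation: the job graph above (ParallelRead s1 feeding ParallelDo s2 feeding ParallelWrite s3, with type JOB_TYPE_STREAMING) is what the Dataflow service receives for the streaming-metrics exercise. Below is a minimal Beam Python sketch of a pipeline with that shape; the subscription/topic strings are placeholders and the DoFn body is illustrative, not the verbatim dataflow_exercise_streaming_metrics_pipeline source.

import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


class StreamingUserMetricsDoFn(beam.DoFn):
    """Counts bytes seen, mirroring the generate_metrics step above."""

    def process(self, element):
        # Each unique metric name creates one Stackdriver metric descriptor
        # (see the JOB_MESSAGE_WARNING about the 100-descriptor limit below).
        Metrics.counter(self.__class__, 'total_bytes_count').inc(len(element))
        yield element


def run(argv=None):
    options = PipelineOptions(argv)
    options.view_as(StandardOptions).streaming = True  # JOB_TYPE_STREAMING
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input-sub>')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/<project>/topics/<output-topic>'))
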
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T02:21:55.663016Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_19_21_54-16591147715307495948'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310022139-879508'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T02:21:55.663016Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_19_21_54-16591147715307495948]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_54-16591147715307495948?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_19_21_54-16591147715307495948 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:54.555Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:54.556Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_19_21_54-16591147715307495948. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:54.556Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_19_21_54-16591147715307495948.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:57.567Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:58.455Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.078Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.109Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.190Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.234Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.269Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.307Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.342Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.393Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.428Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.458Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.503Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.537Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.576Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:21:59.607Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:22:01.890Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:22:01.930Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:22:01.968Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:22:25.112Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T02:22:33.496Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_19_21_54-16591147715307495948 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
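
The "Timing out on waiting for job ... after 61 seconds" warning in the captured logging above is the test harness bounding how long it blocks on a job that, being streaming, never finishes on its own. A sketch of that pattern with the Beam Python SDK follows; the duration value and the explicit cancel() are illustrative assumptions, not the harness's exact code.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

pipeline = beam.Pipeline(options=PipelineOptions())
# ... build the streaming pipeline here ...
result = pipeline.run()
# wait_until_finish takes a duration in milliseconds; for a streaming job it
# returns after the timeout rather than on completion.
result.wait_until_finish(duration=61 * 1000)
result.cancel()  # assumed cleanup step so the streaming job does not linger
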

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2184.746s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_55-8197679091496948267?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_52-5391547260755794152?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_40_16-9794945093718797925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_49_05-8728565489128412280?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_54-13657531023391620823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_53-17877078747886860673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_39_56-2366453445633019339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_54-16591147715307495948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_06-9432403533845875694?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_40_14-212362765073241674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_56-10961833378028838437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_57-7395179231931622079?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_39_57-8533160201051661423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_53-7112373415638805538?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_49-12174520293607914105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_39_55-12762400446755274727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_56-6733961431175612908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_31_05-4339779577133341762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_39_54-1362419914199025605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_55-9859687391561592266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_52-5461216991890063181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_40_01-17423720185523848665?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_21_54-7375407145534349176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_19_30_58-15370936948044582694?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 15s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/3wbiqcwstljzq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #80

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/80/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-9475] Fix typos and shore up expectations on type

[github] [BEAM-7926] Update Data Visualization (#11020)


------------------------------------------
[...truncated 5.50 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdb9fba47-0ce2-4268-bdf6-27efdb776530"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdb9fba47-0ce2-4268-bdf6-27efdb776530", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputdb9fba47-0ce2-4268-bdf6-27efdb776530", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-10T00:22:19.636099Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_17_22_18-7557664430125777563'
 location: u'us-central1'
 name: u'beamapp-jenkins-0310002158-675374'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-10T00:22:19.636099Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_17_22_18-7557664430125777563]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_18-7557664430125777563?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_17_22_18-7557664430125777563 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:18.571Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:18.571Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_17_22_18-7557664430125777563.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:18.571Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_17_22_18-7557664430125777563. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:21.680Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:22.763Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.479Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.516Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.597Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.640Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.671Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.709Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.741Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.800Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.838Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.874Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.918Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.960Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:23.983Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:24.009Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:26.255Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:26.306Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:26.360Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:39.642Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:22:50.521Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:23:23.948Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-10T00:23:23.989Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_17_22_18-7557664430125777563 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
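
The 100-descriptor JOB_MESSAGE_WARNING in the logging above points at the Monitoring API for cleanup. A hedged sketch of listing (and optionally deleting) old Dataflow custom metric descriptors, assuming the google-cloud-monitoring client library; the filter prefix is an assumption about how Dataflow names its custom metrics.

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = 'projects/apache-beam-testing'
descriptors = client.list_metric_descriptors(request={
    'name': project_name,
    # Assumed prefix for Dataflow-created custom metrics.
    'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow")',
})
for descriptor in descriptors:
    print(descriptor.type)
    # Deletion is irreversible; uncomment only after inspecting the list.
    # client.delete_metric_descriptor(
    #     name=project_name + '/metricDescriptors/' + descriptor.type)
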

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2207.060s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_15-11616083664438441428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_11-16680983084707888875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_41_06-9107138966820200395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_49_59-2771424394200034007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_18-10997207547327432575?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_32_32-6696421680122559844?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_18-7557664430125777563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_29_55-16435828556587239164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_39_03-6571261766429002528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_21-3471656781593105161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_47-13180734082003260791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_40_35-15330376373575253194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_20-5911579759517868531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_39-7855368058291872488?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_40_35-4747435877890251471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_19-3711376620532475720?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_35-2950006080066093827?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_41_00-4320163350038667344?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_21-17062304827606765472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_37-174428501087373682?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_40_51-17642844694700734132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_22_18-3930342814236790382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_31_22-16616898332978068779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_17_40_47-14152399800621337968?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 21m 33s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/joxtofsydmq22

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #79

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/79/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2939] Follow-up on comment in pr/11065

[lcwik] [BEAM-9473] Dont copy over META-INF index/checksum/signing files during

[hannahjiang] update CHANGE.md for 2.20


------------------------------------------
[...truncated 5.54 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputba17f1ac-58ab-4909-a47a-7f2b4f027f0e"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputba17f1ac-58ab-4909-a47a-7f2b4f027f0e", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputba17f1ac-58ab-4909-a47a-7f2b4f027f0e", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T21:59:14.394972Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_14_59_13-7155352181145290358'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309215856-514506'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T21:59:14.394972Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_14_59_13-7155352181145290358]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_13-7155352181145290358?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_14_59_13-7155352181145290358 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:13.199Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:13.199Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_14_59_13-7155352181145290358.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:13.199Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_14_59_13-7155352181145290358. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:16.692Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:17.617Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.171Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.199Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.246Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.277Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.302Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.329Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.352Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.407Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.428Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.453Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.479Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.502Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.530Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:18.556Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:20.792Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:20.833Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:20.873Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:45.444Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T21:59:48.696Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_14_59_13-7155352181145290358 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
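
The google.auth DEBUG lines above show Application Default Credentials resolving a token from the GCE metadata server. The equivalent explicit call is sketched below for reference; it works only where ADC can resolve credentials (e.g. on a GCE/Jenkins worker).

import google.auth
import google.auth.transport.requests

credentials, project_id = google.auth.default()
request = google.auth.transport.requests.Request()
credentials.refresh(request)  # triggers the metadata-server token GET logged above
print(project_id, credentials.valid)
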

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2300.510s

FAILED (failures=1)
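
Context for the counts above: the 28 tests are selected by nose attribute rather than by module, which is why the batch and streaming suites report different subsets. A hypothetical test shaped like the ones in this suite (class and method names are placeholders):

import unittest

from nose.plugins.attrib import attr


class StreamingMetricsIT(unittest.TestCase):  # hypothetical name

    @attr('ValidatesRunner')
    def test_user_counter_reaches_service(self):
        # The real tests run a pipeline on Dataflow and assert on metrics
        # queried back from the service; this placeholder only shows the
        # attribute-based selection (run via: nosetests -a ValidatesRunner).
        self.assertTrue(True)
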
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_11-3580331190458104036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_07-15613982849202331735?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_17_04-2389851621593947609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_13-7155352181145290358?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_07_54-9041656179327816123?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_16_51-12532095590411054069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_12-6908917915578119956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_11-1208884590832378890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_17_11-15551099429419919668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_15-16003641912479255603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_17-8758600286298541514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_17_15-9184748001310381000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_11-16226510170657886987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_51-13343019668195495481?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_18_30-17080998892167987097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_12-17062306168112195804?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_17-3716515382925885076?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_15-10761003167903428581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_09_20-10527163152330453051?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_18_18-7159692551982189743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_59_16-6988184932673196353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_08_15-9789424414670789832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_18_27-6621064571098202319?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_15_28_37-1560681437555021362?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 56s
64 actionable tasks: 49 executed, 15 from cache

Publishing build scan...
https://gradle.com/s/4c3xlm5ymfpbi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #78

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/78/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-9448] Fix log message for job server cache.

[kcweaver] Downgrade cache log level from warn->info.

[apilloud] [BEAM-9463] Bump ZetaSQL to 2020.03.1

[github] [BEAM-9319] Clean up start topic in TestPubsubSignal (#11072)


------------------------------------------
[...truncated 5.53 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input956be15e-029c-4f29-8700-8d1d7ecc1820"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input956be15e-029c-4f29-8700-8d1d7ecc1820", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output956be15e-029c-4f29-8700-8d1d7ecc1820", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T20:32:01.709632Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_13_32_00-13832317485121991747'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309203145-553079'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T20:32:01.709632Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_13_32_00-13832317485121991747]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_00-13832317485121991747?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_13_32_00-13832317485121991747 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:00.545Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_13_32_00-13832317485121991747. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:00.545Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_13_32_00-13832317485121991747.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:00.545Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:04.080Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.040Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.673Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.706Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.777Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.822Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.854Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.890Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.925Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:05.977Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.019Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.050Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.095Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.130Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.165Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:06.199Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:08.436Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:08.466Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:08.507Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:15.223Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
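
The descriptor warning above is driven by user-defined metrics: each unique metric name declared anywhere in the pipeline, independent of which DoFn declares it, becomes one Stackdriver metric descriptor. A minimal sketch of declaring such a metric with the Beam Metrics API; the DoFn and the metric name are hypothetical:

    from apache_beam import DoFn
    from apache_beam.metrics.metric import Metrics

    class CountingDoFn(DoFn):
        def __init__(self):
            # 'matched_elements' is a hypothetical name; each distinct name
            # like this yields one custom metric descriptor in the project.
            self.matched = Metrics.counter(self.__class__, 'matched_elements')

        def process(self, element):
            self.matched.inc()
            yield element
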
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T20:32:32.749Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_13_32_00-13832317485121991747 after 60 seconds
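
The timeout warning above comes from the client side, not the service: the test runs the job and then waits on the PipelineResult with a bounded duration. A sketch, assuming the PipelineResult API; the duration argument is in milliseconds, and the empty pipeline here is a placeholder:

    import apache_beam as beam

    p = beam.Pipeline()  # placeholder; the real test builds the streaming job above
    result = p.run()
    # With a streaming job still in JOB_STATE_RUNNING at the deadline, the
    # Dataflow runner logs "Timing out on waiting for job <id> after 60 seconds".
    result.wait_until_finish(duration=60 * 1000)
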
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
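
The metadata-server traffic above is Application Default Credentials at work: on a Dataflow worker (a GCE VM), google-auth resolves the project id and the default service account's access token from http://metadata.google.internal. A sketch of the equivalent explicit call, assuming google-auth is installed:

    import google.auth

    # On GCE this performs GET requests against metadata.google.internal,
    # exactly like the DEBUG lines above, and returns compute-engine credentials.
    credentials, project_id = google.auth.default()
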
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2222.715s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_02-8110772332789235819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_41_37-10888680600400630909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_50_53-168388248779335778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_14_00_04-14121060343950084367?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_31_59-6723118919002473390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_40_54-15697631317760167910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_49_55-5176046175120195672?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_00-13832317485121991747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_39_54-3476668051265487044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_49_00-127790619240379245?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_00-878299011712034646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_41_00-3672157494428589074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_50_01-11632573267805316529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_00-13943973157492019134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_41_01-7520050107262659952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_50_14-470931312781975686?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_31_59-7649242738850178291?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_42_13-15195550310467421951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_32_01-13882366796389318779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_41_10-7191423970986564741?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_49_52-425339114563811531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_31_59-18440597756633669202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_40_45-18337741515852839862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_13_49_47-14330535542878626684?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 17s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/7uzsee3ojl7hu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #77

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/77/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-9288] Update to use vendored gRPC without shaded conscrypt


------------------------------------------
[...truncated 5.50 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input668ade1f-ac13-4325-a312-8333897ed5aa"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input668ade1f-ac13-4325-a312-8333897ed5aa", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output668ade1f-ac13-4325-a312-8333897ed5aa", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
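
Each step's "kind:windowed_value" encoding in the graph above denotes a WindowedValueCoder whose components are a bytes coder and the global-window coder. A sketch of constructing and round-tripping that coder, assuming the apache_beam.coders API:

    from apache_beam.coders import coders
    from apache_beam.transforms.window import GlobalWindow
    from apache_beam.utils.windowed_value import WindowedValue

    # bytes payload in the global window, matching the component_encodings above
    coder = coders.WindowedValueCoder(coders.BytesCoder(),
                                      window_coder=coders.GlobalWindowCoder())

    wv = WindowedValue(b'payload', 0, (GlobalWindow(),))
    assert coder.decode(coder.encode(wv)).value == b'payload'
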
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T18:05:09.508286Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_11_05_08-12192122854316297970'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309180450-051222'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T18:05:09.508286Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_11_05_08-12192122854316297970]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_08-12192122854316297970?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_11_05_08-12192122854316297970 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:08.493Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:08.493Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_11_05_08-12192122854316297970. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:08.493Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_11_05_08-12192122854316297970.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:11.513Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:12.650Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.520Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.564Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.636Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.678Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.715Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.747Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.850Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.903Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.934Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:13.969Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:14.023Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:14.063Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:14.094Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:14.128Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:28.328Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:28.364Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:28.400Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:31.192Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T18:05:52.386Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_11_05_08-12192122854316297970 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2127.363s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_08-12192122854316297970?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_12_43-4682086940148744468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_21_37-14304331323710203920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_06-18360889397358222514?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_14_06-13538195370582131470?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_23_23-11019979970236045557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_06-7807797421716014335?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_14_30-10288804893441034322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_23_23-9719706555710990392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_31_13-5520894171022143570?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_04-3025234967430348912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_14_01-14195529232861725517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_22_51-7831787959849360200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_07-15477280125260141021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_14_27-684389954297403379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_07-7164041166058445101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_14_03-17905039508195643307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_22_58-7211147494442364194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_05-6787652844564431942?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_13_43-10536013164154075693?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_21_32-8417177813969808971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_05_05-11322706731780746952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_13_09-6879300630852566167?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_11_22_14-7712396601928710600?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 7s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/hdkmy6dmytiwe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #76

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/76/display/redirect?page=changes>

Changes:

[github] [BEAM-9396] Fix Docker image name in CoGBK test for Python on Flink


------------------------------------------
[...truncated 5.52 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2bd71df4-b0f2-49a0-979f-368278f42cf6"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2bd71df4-b0f2-49a0-979f-368278f42cf6", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2bd71df4-b0f2-49a0-979f-368278f42cf6", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T14:23:50.722645Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_07_23_49-18053062136888864895'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309142336-346059'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T14:23:50.722645Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_07_23_49-18053062136888864895]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_49-18053062136888864895?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_07_23_49-18053062136888864895 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:49.764Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:49.764Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_07_23_49-18053062136888864895. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:49.764Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_07_23_49-18053062136888864895.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:53.395Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.121Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.771Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.799Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.859Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.902Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.931Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.957Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:54.986Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.048Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.076Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.099Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.130Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.160Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.185Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:55.214Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:57.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:57.604Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:23:57.649Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:24:20.932Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:24:25.283Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:24:56.237Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T14:24:56.280Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_07_23_49-18053062136888864895 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2143.146s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_48-14934911648249503852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_38-5724243397413348078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_42_02-11427946962734373841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_51_30-2487018471595295469?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_49-18053062136888864895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_31_30-6906769968887808699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_40_28-16475811632965123143?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_49-4755869387825193252?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_42-11491843082979886119?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_41_56-445383370708321292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_49-9176290142641681370?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_29-16263201480567758047?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_41_43-1483396505677213626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_47-16520008440429851307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_32-7417814540236496320?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_49-8762844308482648000?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_33-11541422008999318021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_40_52-2843980605241349242?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_51-4951022558567452021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_32_35-17957341523637463210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_42_04-7662244200532345882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_23_48-9543756535409946558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_31_48-10384544073176075627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_07_40_52-17089333818935091236?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/slo7q4snhpbue

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #75

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/75/display/redirect>

Changes:


------------------------------------------
[...truncated 5.52 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute69d8c63-7a36-4042-8f07-2fe79d448cba"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute69d8c63-7a36-4042-8f07-2fe79d448cba", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outpute69d8c63-7a36-4042-8f07-2fe79d448cba", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T12:45:10.678297Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-09_05_45_09-6778769413780804338'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309124455-414022'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T12:45:10.678297Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-09_05_45_09-6778769413780804338]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_09-6778769413780804338?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-09_05_45_09-6778769413780804338 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:09.271Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:09.271Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-09_05_45_09-6778769413780804338.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:09.271Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-09_05_45_09-6778769413780804338. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:12.712Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:13.583Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.238Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.265Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.337Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.368Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.403Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.438Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.462Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.513Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.539Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.574Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.603Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.632Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.662Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:14.685Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:16.948Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:16.981Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:17.018Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:40.643Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:45:46.359Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:46:11.717Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T12:46:11.751Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-09_05_45_09-6778769413780804338 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2108.421s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_07-8310953667306285168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_53_48-14460246027869733395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_02_48-3136031620109903130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_11_58-8625440599513709484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_07-15059529688900242663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_54_04-11859815310434790578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_09-6778769413780804338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_52_27-2007473257876384144?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_00_22-12635013191210149974?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_10-8198316058777425491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_54_05-1541832932171951376?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_02_12-17367424432125472815?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_08-13815917615994916833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_52_49-3196883922062538310?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_02_01-5189046465845832500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_07-7880116784914816788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_53_16-4672351673613096180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_01_17-1235854194472180009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_09-1648981504331380756?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_53_15-4914076051890088601?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_02_06-1829698895580983534?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_45_09-14341604996955918513?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_05_53_14-5809490965570522298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_06_02_00-1143409707184212878?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 11s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/x3wbda6vdmj2k

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
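
For local reproduction, the two failing suites can be re-run from a Beam checkout with the flags Gradle suggests above (this assumes credentials for the apache-beam-testing project the tests target):

    ./gradlew :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests \
        :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests \
        --stacktrace --info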



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #74

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/74/display/redirect>

Changes:


------------------------------------------
[...truncated 5.52 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd07698ed-946f-483f-b7dc-0553c8fc8387"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd07698ed-946f-483f-b7dc-0553c8fc8387", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputd07698ed-946f-483f-b7dc-0553c8fc8387", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
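
Read back from the step list, the submitted graph is a three-step streaming pipeline: a Pub/Sub read (s1), the metrics-generating ParDo (s2), and a Pub/Sub write (s3). A sketch reconstructed from the step names above — the actual test lives in apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline, and the subscription/topic placeholders stand in for the per-run exercise_streaming_metrics_* resources:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)  # the DoFn named in the s2 display_data

    options = PipelineOptions(['--streaming'])  # "type": "JOB_TYPE_STREAMING"
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input>')    # s1
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())  # s2
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output>'))                # s3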
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T06:41:36.193114Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-08_23_41_35-12077188970990681195'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309064120-348585'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T06:41:36.193114Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-08_23_41_35-12077188970990681195]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_35-12077188970990681195?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-08_23_41_35-12077188970990681195 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:35.231Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-08_23_41_35-12077188970990681195. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:35.231Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-08_23_41_35-12077188970990681195.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:35.231Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:38.503Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:39.307Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:39.892Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:39.924Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:39.994Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.039Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.076Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.115Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.202Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.242Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.266Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.296Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.340Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.373Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.417Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:40.450Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:42.676Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:42.716Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:41:42.741Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:42:06.728Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T06:42:13.107Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-08_23_41_35-12077188970990681195 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
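
The JOB_MESSAGE_WARNING above about the project already holding 100 Dataflow-created metric descriptors points at the Monitoring API for cleanup. A hedged sketch using a recent google-cloud-monitoring client — the filter string is an assumption, so verify it against your actual descriptor list before deleting anything:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # Assumed prefix for Dataflow user metrics; adjust to match the
    # descriptors you actually intend to remove.
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(name=descriptor.name)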

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2201.636s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_34-8242388339744664063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_50_38-2315224566349581511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_59_58-11958949973780790023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-09_00_09_17-13162602195621015290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_32-6116511190998410724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_50_52-8200681710990600030?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_59_27-3659968173895720652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_35-12077188970990681195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_49_52-15576216201921832387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_59_12-12980334952149757922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_34-2862847564846474068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_49_29-13747363873929518182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_58_44-18215441513630692070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_34-15823720861627378999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_50_05-372448340208867074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_58_51-15268406004902597150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_32-10793850012348278864?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_51_02-13930113519264797093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_34-7553160645311474556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_51_09-14741005762881034620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_59_32-6634894897432660177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_41_34-6600809508767955840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_49_46-12597455541946941002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_23_58_10-1923854602093402566?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 37s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/afftretj2imva

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #73

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/73/display/redirect>

Changes:


------------------------------------------
[...truncated 5.54 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc8de80a6-2d92-4f3d-bd2e-faf3b38d1551"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc8de80a6-2d92-4f3d-bd2e-faf3b38d1551", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputc8de80a6-2d92-4f3d-bd2e-faf3b38d1551", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-09T00:43:36.223289Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-08_17_43_34-4010603727566101902'
 location: u'us-central1'
 name: u'beamapp-jenkins-0309004321-241015'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-09T00:43:36.223289Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-08_17_43_34-4010603727566101902]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_34-4010603727566101902?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-08_17_43_34-4010603727566101902 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:35.026Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:35.026Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-08_17_43_34-4010603727566101902.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:35.026Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-08_17_43_34-4010603727566101902. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:38.475Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:39.422Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:39.987Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.028Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.113Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.183Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.223Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.261Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.314Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.425Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.471Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.513Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.570Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.611Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.648Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:40.702Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:45.792Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:45.832Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:43:45.860Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:44:02.699Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-09T00:44:14.970Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-08_17_43_34-4010603727566101902 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
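
The autoscaling JOB_MESSAGE_WARNING above ("Workers will scale between 1 and 100 unless maxNumWorkers is specified") can be silenced by pinning the ceiling explicitly. A minimal sketch via WorkerOptions, with an illustrative cap:

    from apache_beam.options.pipeline_options import PipelineOptions, WorkerOptions

    options = PipelineOptions(['--streaming'])
    # Equivalent to passing --max_num_workers=2 on the command line;
    # caps Streaming Engine autoscaling for the job.
    options.view_as(WorkerOptions).max_num_workers = 2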

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2152.527s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_30-13470297312205881050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_52_30-7294524429874435737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_01_31-222401857434364939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_10_26-1363932312922042887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_34-4010603727566101902?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_51_30-5973292605360327298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_00_35-11925240865495275309?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_32-12289156992135484593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_52_41-7489440930533464064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_01_31-11879393809090189770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_32-17708664043699660688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_51_39-17532664080458437900?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_01_29-3458972573817845113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_30-8937275690306557027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_52_29-2986829940991267947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_01_18-8036075960696753426?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_32-16118827604442305646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_52_40-2003159165660265354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_33-12943352538300842061?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_51_23-12511494309539411777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_00_34-9963793163490286127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_43_31-17712804640982818417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_17_52_27-1864772074385360758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_18_01_21-12217132517163433934?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 10s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/q64vjabsspbok

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #72

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/72/display/redirect>

Changes:


------------------------------------------
[...truncated 5.52 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5e4ba310-8245-4c58-b6fc-616413a432b2"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5e4ba310-8245-4c58-b6fc-616413a432b2", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output5e4ba310-8245-4c58-b6fc-616413a432b2", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-08T18:45:22.523451Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-08_11_45_21-3022323834362634947'
 location: u'us-central1'
 name: u'beamapp-jenkins-0308184505-308482'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-08T18:45:22.523451Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-08_11_45_21-3022323834362634947]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_21-3022323834362634947?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-08_11_45_21-3022323834362634947 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:21.574Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-08_11_45_21-3022323834362634947.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:21.574Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:21.574Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-08_11_45_21-3022323834362634947. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:24.497Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:25.300Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:25.917Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:25.951Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.060Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.110Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.140Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.171Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.206Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.251Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.280Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.385Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.417Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.454Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.494Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:26.528Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:28.770Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:28.801Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:28.837Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:45:54.228Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:46:04.530Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:46:28.310Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T18:46:28.339Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-08_11_45_21-3022323834362634947 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
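
StreamingUserMetricsDoFn in the graph above exercises user-defined metrics, and each unique metric name becomes its own Stackdriver metric descriptor — which is why the 100-descriptor warning keeps firing for this project. A generic sketch of such a DoFn, as an illustrative stand-in rather than the test's implementation:

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class ExampleMetricsDoFn(beam.DoFn):
        """Illustrative stand-in for StreamingUserMetricsDoFn."""

        def __init__(self):
            # Each unique (namespace, name) pair produces one descriptor.
            self.elements = Metrics.counter(self.__class__, 'total_elements')

        def process(self, element):
            self.elements.inc()
            yield element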

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2165.493s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_21-3022323834362634947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_53_34-6316947204548765720?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_03_23-11015856579203996435?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_12_27-8178725118108072436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_18-14616303521799637519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_16-5614723503055311773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_03_04-8035380471680992053?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_20-6598194481123285292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_27-3817367146697742044?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_03_18-11385653618265231722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_20-10727485460089006983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_13-7200182893126296684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_02_40-6906544618545731142?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_18-14190541220874621840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_07-1450814534252078244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_03_01-5779987998279026978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_19-14540861956904970574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_15-11225118544144901973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_20-5503524323378169814?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_14-3349096920428656998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_02_57-4559474770890001832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_45_18-16921133614391853852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_11_54_02-12827990519891554502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_12_03_01-8392971590181386693?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/n4cslkbxnejmy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #71

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/71/display/redirect>

Changes:


------------------------------------------
[...truncated 5.54 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input867e532c-3b50-4480-ad82-8db42e802ffa"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input867e532c-3b50-4480-ad82-8db42e802ffa", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output867e532c-3b50-4480-ad82-8db42e802ffa", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
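For orientation: the three-step graph above (ParallelRead s1, ParallelDo s2, ParallelWrite s3) is what the Dataflow runner emits for a streaming pipeline of roughly the following shape. This is a minimal sketch, not the actual dataflow_exercise_streaming_metrics_pipeline source; the DoFn body and the <project>/<input-sub>/<output-topic> placeholders are assumptions, while the step labels match the user_name fields in the JSON.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        # Stand-in body; the real DoFn in this test updates user metrics.
        def process(self, element):
            yield element

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # type: JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input-sub>')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/<project>/topics/<output-topic>'))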
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-08T12:42:13.386882Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-08_05_42_12-17318942307865568948'
 location: u'us-central1'
 name: u'beamapp-jenkins-0308124157-220124'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-08T12:42:13.386882Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-08_05_42_12-17318942307865568948]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_12-17318942307865568948?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-08_05_42_12-17318942307865568948 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:12.364Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-08_05_42_12-17318942307865568948. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:12.364Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:12.365Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-08_05_42_12-17318942307865568948.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:16.236Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.086Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.748Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.785Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.852Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.893Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:17.923Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.051Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.091Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.155Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.195Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.228Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.269Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.309Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.347Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:18.381Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:32.691Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:32.719Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:32.755Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:42:51.659Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T12:43:00.558Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-08_05_42_12-17318942307865568948 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
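The "Timing out on waiting for job ... after 60 seconds" warning in the captured logging above is the test harness bounding how long it blocks on a streaming job; in the Python SDK that bound is the duration argument of wait_until_finish, in milliseconds. A minimal sketch matching the 60 seconds in the log (pipeline construction elided):

    import apache_beam as beam

    p = beam.Pipeline()  # transforms elided
    result = p.run()
    # Block for at most 60 seconds; a streaming job can still be RUNNING
    # when this returns, which is what produces the WARNING above.
    result.wait_until_finish(duration=60 * 1000)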

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2177.091s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_11-9200316696868147264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_51_19-11882656742554012553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_06_00_25-11313483263277148385?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_06_09_10-10389199643261872830?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_12-17318942307865568948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_49_58-8447454373302564861?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_59_07-9298935973013926961?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_12-2166803640208368669?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_51_07-13549378499675109732?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_06_00_06-8700478562089979220?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_12-2199430046801150132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_50_39-6415664237655088918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_59_42-1434599836500375377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_09-4129708391926676817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_50_59-13451013414249805774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_06_00_12-4322720107247837045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_11-15224126843123293583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_50_10-12056733491773193983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_59_30-3839021750000006357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_12-6849805236167182122?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_51_12-9019385670335069868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_42_10-15795582987534895774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_50_50-17404293798956669517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-08_05_58_49-9793929655526895542?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 17s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/rvdmc546faxgg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #70

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/70/display/redirect>

Changes:


------------------------------------------
[...truncated 43.28 KB...]
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>

> Task :sdks:java:harness:shadowJar

> Task :sdks:python:sdist
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
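The RefactoringTool lines above come from the sdist step running the stdlib lib2to3 machinery over the freshly generated *_pb2.py files; the repeated "Generating grammar tables" lines are lib2to3 building its parser tables first. Driving the same machinery directly looks roughly like this sketch; the fixer package gen_protos actually applies is not visible in the log, so 'lib2to3.fixes' is an assumption:

    from lib2to3.refactor import RefactoringTool, get_fixers_from_package

    # Assumption: the stock fixer set; gen_protos may configure its own.
    fixers = get_fixers_from_package('lib2to3.fixes')
    tool = RefactoringTool(fixers)
    # write=True rewrites files in place, which is what produces the
    # "Refactored <...>_pb2.py" / "No changes to <...>" lines above.
    tool.refactor(['apache_beam/portability/api/beam_fn_api_pb2.py'], write=True)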


> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl (1.0 MB)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (1.27.2)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached mock-2.0.0-py2.py3-none-any.whl (56 kB)
Collecting numpy<2,>=1.14.3
  Using cached numpy-1.16.6-cp27-cp27mu-manylinux1_x86_64.whl (17.0 MB)
Collecting pymongo<4.0.0,>=3.8.0
  Using cached pymongo-3.10.1-cp27-cp27mu-manylinux1_x86_64.whl (444 kB)
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.11.3)
Collecting pydot<2,>=1.2.0
  Using cached pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting python-dateutil<3,>=2.8.0
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2019.3-py2.py3-none-any.whl (509 kB)
Processing /home/jenkins/.cache/pip/wheels/51/14/c1/d4e383d261ced6c549ea2d072cc3a3955744948d9b0d2698f6/avro-1.9.2-cp27-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached funcsigs-1.0.2-py2.py3-none-any.whl (17 kB)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl (17.5 MB)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.7.4.1)
Collecting typing-extensions<3.8.0,>=3.7.0
  Using cached typing_extensions-3.7.4.1-py2-none-any.whl (9.0 kB)
Collecting cachetools<4,>=3.1.0
  Using cached cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Collecting google-apitools<0.5.29,>=0.5.28
  Using cached google_apitools-0.5.28-py2-none-any.whl (134 kB)
Collecting google-cloud-datastore<1.8.0,>=1.7.1
  Using cached google_cloud_datastore-1.7.4-py2.py3-none-any.whl (82 kB)
Collecting google-cloud-pubsub<1.1.0,>=0.39.0
  Using cached google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-cloud-bigquery<=1.24.0,>=1.6.0
  Using cached google_cloud_bigquery-1.24.0-py2.py3-none-any.whl (165 kB)
Collecting google-cloud-core<2,>=0.28.1
  Using cached google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
  Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0
  Using cached google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting grpcio-gcp<1,>=0.2.2
  Using cached grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0
  Using cached google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language<2,>=1.3.0
  Using cached google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0
  Using cached google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Processing /home/jenkins/.cache/pip/wheels/09/61/a5/7e8f4442b3c3d406ee9eb6c06e1ecbe5625f62f8cb19c08f5b/googledatastore-7.0.2-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/bd/ce/33/8b769968db3761c42c7a91d8a0dbbafc50acfa0750866c8abd/proto_google_cloud_datastore_v1-0.90.4-cp27-none-any.whl
Collecting freezegun>=0.3.12
  Using cached freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting nose>=1.3.7
  Using cached nose-1.3.7-py2-none-any.whl (154 kB)
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp27-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl (10.1 MB)
Collecting parameterized<0.8.0,>=0.7.1
  Using cached parameterized-0.7.1-py2.py3-none-any.whl (24 kB)
Processing /home/jenkins/.cache/pip/wheels/8f/88/5d/2d12b9e226ee11ce171a603275d8dd6546d93202466b7fe173/PyHamcrest-1.10.1-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/e4/76/4d/a95b8dd7b452b69e8ed4f68b69e1b55e12c9c9624dd962b191/PyYAML-5.3-cp27-cp27mu-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.7.0-py2.py3-none-any.whl (23 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.9-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.31.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.3.4-py2.py3-none-any.whl (10 kB)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.21.0.dev0) (1.14.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.21.0.dev0) (1.1.9)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting requests>=2.7.0
  Using cached requests-2.23.0-py2.py3-none-any.whl (58 kB)
Collecting pbr>=0.11
  Using cached pbr-5.4.4-py2.py3-none-any.whl (110 kB)
Collecting rsa>=3.1.4
  Using cached rsa-4.0-py2.py3-none-any.whl (38 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting pyasn1-modules>=0.0.5
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.21.0.dev0) (44.0.0)
Collecting pyparsing>=2.1.4
  Using cached pyparsing-2.4.6-py2.py3-none-any.whl (67 kB)
Collecting fasteners>=0.14
  Using cached fasteners-0.15-py2.py3-none-any.whl (23 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached google_api_core-1.16.0-py2.py3-none-any.whl (70 kB)
Processing /home/jenkins/.cache/pip/wheels/de/3a/83/77a1e18e1a8757186df834b86ce6800120ac9c79cd8ca4091b/grpc_google_iam_v1-0.12.3-cp27-none-any.whl
Collecting google-auth<2.0dev,>=1.9.0
  Using cached google_auth-1.11.2-py2.py3-none-any.whl (76 kB)
Collecting google-resumable-media<0.6dev,>=0.5.0
  Using cached google_resumable_media-0.5.0-py2.py3-none-any.whl (38 kB)
Processing /home/jenkins/.cache/pip/wheels/2c/f9/7f/6eb87e636072bf467e25348bbeb96849333e6a080dca78f706/googleapis_common_protos-1.51.0-cp27-none-any.whl
Collecting monotonic>=0.6; python_version == "2.7"
  Using cached monotonic-1.5-py2.py3-none-any.whl (5.3 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.3.0-py2.py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (0.13.1)
Collecting packaging
  Using cached packaging-20.3-py2.py3-none-any.whl (37 kB)
Collecting attrs>=17.4.0
  Using cached attrs-19.3.0-py2.py3-none-any.whl (39 kB)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.5.0)
Collecting wcwidth
  Using cached wcwidth-0.1.8-py2.py3-none-any.whl (17 kB)
Collecting more-itertools<6.0.0,>=4.0.0; python_version <= "2.7"
  Using cached more_itertools-5.0.0-py2-none-any.whl (52 kB)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (2.3.5)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.8.1)
Collecting pytest-forked
  Using cached pytest_forked-1.1.3-py2.py3-none-any.whl (4.5 kB)
Collecting execnet>=1.1
  Using cached execnet-1.7.1-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Collecting idna<3,>=2.5
  Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting chardet<4,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached urllib3-1.25.8-py2.py3-none-any.whl (125 kB)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.2.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.10.0)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.21.0.dev0-py2-none-any.whl size=1966137 sha256=5b61e620ba439a5ecc6cf11480a786d28ad77064a9a182644159e098a82a6b9f
  Stored in directory: /home/jenkins/.cache/pip/wheels/ad/b9/f4/76cfa08e37c58977b75eacbec4ff8f3ac2a0a8e409dca0ed3a
Successfully built apache-beam
Installing collected packages: crcmod, dill, fastavro, docopt, certifi, idna, chardet, urllib3, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro, pyvcf, pyarrow, typing-extensions, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, google-cloud-dlp, google-cloud-language, google-cloud-videointelligence, google-cloud-vision, proto-google-cloud-datastore-v1, googledatastore, freezegun, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, atomicwrites, packaging, attrs, wcwidth, more-itertools, pytest, pytest-forked, apipkg, execnet, pytest-xdist, pytest-timeout, apache-beam
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device


> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED
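Note the root cause for this run is not a test failure but pip's "[Errno 28] No space left on device" (ENOSPC) during installGcpTest above. A quick check of free space from Python, since the agents in this log run py2.7 (the paths are assumptions based on the pip cache directory shown above):

    import os

    for path in ('/home/jenkins', '/tmp'):
        st = os.statvfs(path)
        free_gib = st.f_bavail * st.f_frsize / float(2 ** 30)
        print('%s: %.1f GiB free' % (path, free_gib))

Clearing /home/jenkins/.cache/pip, the wheel cache visible in the Processing lines above, is one way to recover space on the agent.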

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 12s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/cbv46d3xl4d5c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #69

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/69/display/redirect>

Changes:


------------------------------------------
[...truncated 5.55 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf6bedb96-8d4d-49d5-a5d8-d8b6a84a1ea6"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf6bedb96-8d4d-49d5-a5d8-d8b6a84a1ea6", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputf6bedb96-8d4d-49d5-a5d8-d8b6a84a1ea6", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
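The Create job lines that follow are the apiclient submitting this graph to the Dataflow service. Reconstructing the submission options implied by the log (project, region, streaming mode, and, going by the suite name, Runner V2), a sketch might look like this; the temp_location bucket and the use_runner_v2 experiment are assumptions, the rest is read off the job metadata below:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',       # projectId in the Job below
        '--region=us-central1',                # location in the Job below
        '--temp_location=gs://<bucket>/temp',  # placeholder
        '--streaming',                         # type: JOB_TYPE_STREAMING
        '--experiments=use_runner_v2',         # assumption, from the _V2 suite name
    ])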
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-08T00:43:53.345342Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-07_16_43_52-11041438174221364423'
 location: u'us-central1'
 name: u'beamapp-jenkins-0308004335-696943'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-08T00:43:53.345342Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-07_16_43_52-11041438174221364423]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_52-11041438174221364423?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-07_16_43_52-11041438174221364423 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:52.127Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:52.128Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-07_16_43_52-11041438174221364423.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:52.128Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-07_16_43_52-11041438174221364423. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:57.922Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:59.078Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:59.856Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:43:59.934Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.125Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.200Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.236Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.271Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.309Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.360Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.390Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.425Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.470Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.499Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.536Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:00.570Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:03.124Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:03.227Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:03.301Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:11.827Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:27.507Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:54.407Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-08T00:44:54.447Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-07_16_43_52-11041438174221364423 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
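The JOB_MESSAGE_WARNING above about the project already containing 100 Dataflow-created metric descriptors points at the Monitoring API for cleanup. A hedged sketch with the google-cloud-monitoring client; the custom.googleapis.com/dataflow prefix is an assumption about where Dataflow puts these descriptors, so inspect what list returns before enabling the delete:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project = 'projects/apache-beam-testing'
    for descriptor in client.list_metric_descriptors(name=project):
        # Assumption: Dataflow-created custom metrics live under this prefix.
        if descriptor.type.startswith('custom.googleapis.com/dataflow'):
            print('would delete %s' % descriptor.name)
            # client.delete_metric_descriptor(name=descriptor.name)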

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2288.020s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_49-7219036768968727905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_53_55-2792540505086870718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_02_44-520960735965173620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_11_48-3545080116169714677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_49-13934030829557332266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_52_25-4743371997574361716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_01_25-2076423231769986146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_52-11041438174221364423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_51_26-4895155331639358992?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_00_46-15501231530961996257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_51-12632295553962477786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_51_42-9488816445704802868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_00_49-8971512364470741091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_50-11074459201372704119?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_52_39-14323012014956598868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_01_16-12629644839671423297?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_47-6671240559922143421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_52_39-13397463061301913700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_01_36-539504388674705102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_49-7334883334774944913?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_52_49-6455208674617868911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_17_01_39-5844788302953041962?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_43_51-16667373903809562637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_16_54_05-16296959753117929704?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 41s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/xnuyf6xhcwj4g

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #68

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/68/display/redirect>

Changes:


------------------------------------------
[...truncated 5.51 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input6017d59d-375f-4dad-b386-04bf73fcb2b0"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input6017d59d-375f-4dad-b386-04bf73fcb2b0", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output6017d59d-375f-4dad-b386-04bf73fcb2b0", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
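
For orientation, the three-step graph above (ParallelRead -> ParallelDo -> ParallelWrite) is what the Python SDK emits for a pipeline of roughly the following shape. This is a minimal sketch with placeholder resource names and a stand-in DoFn, not the actual test code; the real DoFn is StreamingUserMetricsDoFn from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


class EchoDoFn(beam.DoFn):
    """Stand-in for StreamingUserMetricsDoFn: passes elements through."""
    def process(self, element):
        yield element


options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # yields JOB_TYPE_STREAMING

with beam.Pipeline(options=options) as p:
    (p
     | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
         subscription='projects/<project>/subscriptions/<input-subscription>')
     | 'generate_metrics' >> beam.ParDo(EchoDoFn())
     | 'dump_to_pub' >> beam.io.WriteToPubSub(
         topic='projects/<project>/topics/<output-topic>'))
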
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-07T18:46:18.209908Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-07_10_46_17-12235103920399117019'
 location: u'us-central1'
 name: u'beamapp-jenkins-0307184601-174969'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-07T18:46:18.209908Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-07_10_46_17-12235103920399117019]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_17-12235103920399117019?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-07_10_46_17-12235103920399117019 is in state JOB_STATE_RUNNING
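
The job settings visible above (project apache-beam-testing, region us-central1, streaming type, auto-generated beamapp-jenkins-* name) all come from pipeline options on the client side. A plausible reconstruction of the flags involved, as a sketch with the temp_location bucket left as a placeholder:

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=DataflowRunner',
    '--project=apache-beam-testing',
    '--region=us-central1',
    '--streaming',
    '--temp_location=gs://<bucket>/tmp',  # placeholder bucket
])
# With no explicit --job_name, the SDK generates one of the form
# beamapp-<user>-<timestamp>, matching the job name above.
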
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:17.110Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:17.110Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-07_10_46_17-12235103920399117019. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:17.110Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-07_10_46_17-12235103920399117019.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:22.431Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:23.259Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:23.887Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:23.955Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.023Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.066Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.098Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.133Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.172Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.225Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.255Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.288Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.331Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.362Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.399Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:24.437Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:26.906Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:26.993Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:27.083Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:41.164Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:46:51.909Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:47:28.948Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T18:47:28.977Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-07_10_46_17-12235103920399117019 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
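
The google.auth and urllib3 DEBUG lines in the captured log show the standard GCE metadata-server credential flow: discover the VM's default service account, then fetch an OAuth2 access token for it. A minimal equivalent using the requests library directly (the Metadata-Flavor header is mandatory, and the endpoint is only reachable from inside Google Cloud):

import requests

METADATA = 'http://metadata.google.internal/computeMetadata/v1'

# Same request as the last DEBUG lines above: fetch an access token for
# the instance's default service account.
resp = requests.get(
    METADATA + '/instance/service-accounts/default/token',
    headers={'Metadata-Flavor': 'Google'})
resp.raise_for_status()
token = resp.json()['access_token']  # response also carries expires_in, token_type
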

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2198.179s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_15-15732199457181637063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_56_04-15680791236994289186?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_54-1133892607858513095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_13_43-11201958799064435003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_15-13444966276118799377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_15-12588302654984540018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_20-18041367327939424609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_17-12235103920399117019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_54_02-11783174830683797789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_02_57-17493888190980498066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_18-18096875376882655368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_22-14569835034337827533?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_11-7414354307422813674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_16-1429789850237262185?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_16-4575382935453325879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_10-16546385615737619047?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_14-4033721831490229908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_10-17828881461691750055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_15-4823234428864695290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_07-5520772544157972314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_30-2559926279401002930?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_46_16-8563048521711083841?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_10_55_13-9709133556079709043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_11_04_07-10610511383396245214?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 18s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/j6o2clvajnwk2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #67

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/67/display/redirect>

Changes:


------------------------------------------
[...truncated 5.53 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input28f294bd-1230-42c0-afde-aa971abac9d3"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input28f294bd-1230-42c0-afde-aa971abac9d3", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output28f294bd-1230-42c0-afde-aa971abac9d3", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
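
The generate_metrics step in this graph wraps StreamingUserMetricsDoFn, which exists to exercise user-defined metrics on the streaming runner. An illustrative DoFn in the same spirit; the class and metric names here are made up, and only the Metrics API calls are the real Beam surface:

import apache_beam as beam
from apache_beam.metrics import Metrics


class IllustrativeMetricsDoFn(beam.DoFn):
    """Made-up example that reports a counter and a distribution."""

    def __init__(self):
        self.elements = Metrics.counter(self.__class__, 'total_elements')
        self.sizes = Metrics.distribution(self.__class__, 'payload_size')

    def process(self, element):
        self.elements.inc()              # running count of elements seen
        self.sizes.update(len(element))  # tracks min/max/sum/count of payload sizes
        yield element
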
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-07T12:42:26.966617Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-07_04_42_25-16094748905513539073'
 location: u'us-central1'
 name: u'beamapp-jenkins-0307124208-964397'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-07T12:42:26.966617Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-07_04_42_25-16094748905513539073]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_25-16094748905513539073?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-07_04_42_25-16094748905513539073 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:25.861Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:25.861Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-07_04_42_25-16094748905513539073. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:25.861Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-07_04_42_25-16094748905513539073.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:30.558Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:31.630Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.166Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.194Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.260Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.307Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.343Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.382Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.416Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.479Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.515Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.539Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.575Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.603Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.629Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:32.656Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:36.920Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:36.945Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:36.980Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:42:45.799Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T12:43:04.391Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-07_04_42_25-16094748905513539073 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
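
The "Timing out on waiting for job ... after 60 seconds" warning above is what the runner logs when a bounded wait on a still-running streaming job expires. The pattern is roughly the following sketch, assuming `pipeline` is a constructed beam.Pipeline (note that the duration argument is in milliseconds):

result = pipeline.run()                       # returns a PipelineResult
result.wait_until_finish(duration=60 * 1000)  # give up waiting after 60s
# A streaming job keeps running after the wait expires; tests typically
# inspect result.metrics() and then cancel the job:
result.cancel()
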

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2286.711s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_22-10400885727583088500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_19-13688709093765798189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_01_26-3526998153145399153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_10_30-3317265007883748112?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_22-18162737402741848500?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_24-15918056339766484158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_00_25-17783129397437405616?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_25-16094748905513539073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_50_20-3992743528789071134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_59_20-14539983696321318724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_23-14469166098822349049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_29-3366573772192033782?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_00_40-8751209407081156454?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_24-7471089325665349245?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_39-15051713629774551124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_00_24-4077915358245152406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_23-18268306440409235036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_26-13775682182255128795?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_00_42-15479587733515211641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_25-12745169688376312502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_52_45-440157402769550954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_42_23-1242048719579215865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_04_51_33-6327724171163573417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-07_05_00_48-1474253758708999344?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 9s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/fceno6i6j5acu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #66

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/66/display/redirect>

Changes:


------------------------------------------
[...truncated 5.51 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputaed085ff-b94f-41e6-9eb5-78664aa41d33"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputaed085ff-b94f-41e6-9eb5-78664aa41d33", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputaed085ff-b94f-41e6-9eb5-78664aa41d33", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
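
The encoding blocks that recur in these graphs, kind:windowed_value wrapping kind:bytes plus kind:global_window, correspond to the SDK's coder objects. A sketch of the same composition (coder internals are SDK implementation details, so treat this as illustrative):

from apache_beam.coders import coders
from apache_beam.transforms.window import GlobalWindow
from apache_beam.utils.windowed_value import WindowedValue

# kind:windowed_value wrapping kind:bytes, windowed into the global window.
coder = coders.WindowedValueCoder(coders.BytesCoder(),
                                  coders.GlobalWindowCoder())

wv = WindowedValue(b'payload', 0, (GlobalWindow(),))
assert coder.decode(coder.encode(wv)).value == b'payload'
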
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-07T06:44:02.547873Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_22_44_01-11260798018251475342'
 location: u'us-central1'
 name: u'beamapp-jenkins-0307064344-632660'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-07T06:44:02.547873Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_22_44_01-11260798018251475342]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_44_01-11260798018251475342?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_22_44_01-11260798018251475342 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:01.485Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_22_44_01-11260798018251475342. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:01.485Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:01.485Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_22_44_01-11260798018251475342.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:04.691Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:05.504Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.029Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.143Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.194Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.225Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.255Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.279Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.310Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.350Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.374Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.398Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.431Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.455Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.477Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:06.499Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:08.731Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:08.752Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:08.775Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:26.612Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T06:44:39.233Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_22_44_01-11260798018251475342 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2197.718s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_43_58-16181605903597954887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_52_44-18323166912148796987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_01_40-4681243259026962658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_44_00-13108475047715487846?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_53_44-15008421388756853134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_44_01-6309600700226543654?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_53_45-4229161946810202419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_02_46-7724273297752540414?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_43_58-3419085931482971643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_52_55-1728846750221225195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_02_06-13807754690821249997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_44_01-11260798018251475342?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_51_42-10066417932506356361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_00_52-258231912165177863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_44_02-7610249791020539574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_52_53-3491025546901166531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_02_02-5532013352900315316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_43_59-3913045748183807464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_53_48-14626012197929343704?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_03_02-202935125102544008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_11_48-7565823965258006223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_43_59-5502976598944749778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_22_52_46-11217118221671854721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_23_01_50-4972037134384985940?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 11s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ztivrka327koy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #65

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/65/display/redirect?page=changes>

Changes:

[chadrik] [BEAM-9274] Support running yapf in a git pre-commit hook

[lcwik] [BEAM-9464] Fix WithKeys to respect parameterized types

[ankurgoenka] [BEAM-9465] Fire repeatedly in reshuffle


------------------------------------------
[...truncated 5.51 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdc22f085-b2a8-4c38-bc79-0f6ae3a823db"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdc22f085-b2a8-4c38-bc79-0f6ae3a823db", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputdc22f085-b2a8-4c38-bc79-0f6ae3a823db", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
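
Each read step above reports with_attributes: false, meaning the source emits raw payload bytes. Flipping the flag changes the element type, as sketched below with a placeholder subscription name:

import apache_beam as beam
from apache_beam.io.gcp.pubsub import PubsubMessage  # element type when attributes are kept

sub = 'projects/<project>/subscriptions/<subscription>'

# with_attributes=False (the default, as in these graphs): elements are bytes.
read_bytes = beam.io.ReadFromPubSub(subscription=sub)

# with_attributes=True: elements are PubsubMessage objects exposing
# .data (bytes) and .attributes (a dict of string key/value pairs).
read_messages = beam.io.ReadFromPubSub(subscription=sub, with_attributes=True)
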
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-07T02:42:12.802298Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_18_42_11-14985197190116138177'
 location: u'us-central1'
 name: u'beamapp-jenkins-0307024151-537961'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-07T02:42:12.802298Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_18_42_11-14985197190116138177]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_11-14985197190116138177?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_18_42_11-14985197190116138177 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:11.578Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_18_42_11-14985197190116138177.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:11.578Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_18_42_11-14985197190116138177. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:11.578Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:15.221Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:17.301Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:17.933Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:17.972Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.046Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.097Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.129Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.165Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.194Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.250Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.293Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.325Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.374Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.411Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.457Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:18.494Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:51.104Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:56.868Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:56.922Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:42:56.991Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T02:43:22.486Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_18_42_11-14985197190116138177 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
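
The JOB_MESSAGE_WARNING about 100 Dataflow-created metric descriptors refers to a per-project limit; clearing unused descriptors frees it, as the warning's links describe. A sketch using the google-cloud-monitoring client (client signatures vary across library versions, and the filter below is a placeholder to replace with a real criterion before deleting anything):

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project = 'projects/apache-beam-testing'

for descriptor in client.list_metric_descriptors(name=project):
    # Placeholder predicate: decide which custom descriptors are obsolete.
    if '<obsolete-metric-prefix>' in descriptor.type:
        client.delete_metric_descriptor(name=descriptor.name)
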

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2263.622s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_10-16120345878941683041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_52_20-17436308574703889819?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_01_10-11201647363542394266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_10_09-12656118856791424103?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_12-4373956936261022295?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_51_33-5437107379351798549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_11-14985197190116138177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_50_52-15685918970349381037?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_59_52-12494869386132705772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_12-15864250684825381135?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_51_15-7924971091334793080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_59_59-7782266395210406202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_14-983428110909238054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_51_17-12708610319213174711?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_00_11-11388328212317181351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_11-4177654494659778954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_52_13-16780185140255942073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_01_00-17891741111605308237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_14-2437860820524369603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_51_17-14077822303359894748?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_00_16-13375847192914685806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_42_10-17584373886209481183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_18_51_31-14402793873158135028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_19_00_56-16742895417822965590?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 30s
64 actionable tasks: 59 executed, 5 from cache

Publishing build scan...
https://gradle.com/s/3b7oj6paaurh6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #64

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/64/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-2939, BEAM-9458] Add deduplication transform for SplittableDoFns


------------------------------------------
[...truncated 5.51 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd0fdffa4-c026-4007-8de4-e515f5b2c843"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd0fdffa4-c026-4007-8de4-e515f5b2c843", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputd0fdffa4-c026-4007-8de4-e515f5b2c843", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
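
The three steps above (s1 ParallelRead -> s2 ParallelDo -> s3 ParallelWrite) encode a pipeline of roughly the following shape. This is a sketch that mirrors the user_name fields in the JSON with a stand-in DoFn; the real module is apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline, and the <project>/<input-sub>/<output-topic> values are placeholders:

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Stand-in DoFn that bumps a user counter for every element."""
        def __init__(self):
            self.element_count = Metrics.counter(self.__class__, 'element_count')

        def process(self, element):
            self.element_count.inc()
            yield element

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription='projects/<project>/subscriptions/<input-sub>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
               topic='projects/<project>/topics/<output-topic>'))
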
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-07T00:37:00.066478Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_16_36_58-693336396679475731'
 location: u'us-central1'
 name: u'beamapp-jenkins-0307003643-213715'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-07T00:37:00.066478Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_16_36_58-693336396679475731]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_58-693336396679475731?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_16_36_58-693336396679475731 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:36:58.720Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_16_36_58-693336396679475731. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:36:58.720Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_16_36_58-693336396679475731.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:36:58.720Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
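
This autoscaling WARNING is advisory: with no explicit cap, Streaming Engine may scale anywhere from 1 to 100 workers. When a bounded range is wanted, it is set through the standard Dataflow pipeline options; the flag values below are illustrative:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',  # project named in the log
        '--region=us-central1',
        '--streaming',
        '--num_workers=1',
        '--max_num_workers=3',            # caps the default 1..100 range
    ])
    # pass `options` to beam.Pipeline(options=options) as usual
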
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:03.046Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:05.258Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:05.847Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:05.881Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:05.951Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:05.990Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.034Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.070Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.109Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.147Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.180Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.221Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.263Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.293Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.329Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:06.358Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:08.630Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:08.723Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:08.786Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:22.782Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:37:38.920Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:38:09.584Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-07T00:38:09.620Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_16_36_58-693336396679475731 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
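
The google.auth/urllib3 DEBUG chatter above is the normal application-default-credentials flow on a GCE worker: probe the metadata server, resolve the project, then fetch a short-lived access token for the attached service account. The equivalent in a few lines (only works where the metadata server at metadata.google.internal is reachable):

    import google.auth
    from google.auth.transport.requests import Request

    credentials, project = google.auth.default()  # discovers the metadata server on GCE
    credentials.refresh(Request())                # triggers the .../token request seen above
    print(project, bool(credentials.token))
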
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2209.787s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_56-2971970744985186820?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_45_26-12612584086888502685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_54_31-4166485072924258789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_58-693336396679475731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_45_31-4450970511610601922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_55_38-5173156852897368169?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_17_04_33-468963550876888893?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_59-762260134972449752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_46_13-12793519571962802650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_55_10-7636578101597305760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_56-8911660771468725387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_45_06-17100445070761427054?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_54_28-14439398836219204496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_37_02-10771344641966840805?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_46_05-14978136088926617495?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_55_03-117620613930046832?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_55-9652568148231267996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_44_58-8458439226633972582?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_54_51-11674140138667828523?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_57-1440862728016738149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_46_09-8005252452298685906?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_36_57-14470842453329263346?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_46_08-5161396532701126097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_16_55_00-5109214891190325724?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 10s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/g6qbe4bsictvc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #63

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/63/display/redirect?page=changes>

Changes:

[ehudm] Reduce warnings in pytest runs.

[robertwb] Remove excessive logging.


------------------------------------------
[...truncated 5.56 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0306212748-331406", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc62c944d-9ff6-4316-8707-9fd6291b8c4c"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc62c944d-9ff6-4316-8707-9fd6291b8c4c", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputc62c944d-9ff6-4316-8707-9fd6291b8c4c", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T21:28:04.734739Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_13_28_03-4246489135245145071'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306212748-331406'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T21:28:04.734739Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_13_28_03-4246489135245145071]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_03-4246489135245145071?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_13_28_03-4246489135245145071 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:03.791Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_13_28_03-4246489135245145071. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:03.791Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:03.791Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_13_28_03-4246489135245145071.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:09.048Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:12.881Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.508Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.537Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.604Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.648Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.678Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.710Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.751Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.802Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.842Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.877Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.929Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.955Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:13.992Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:14.031Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T21:28:40.018Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_13_28_03-4246489135245145071 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2257.462s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_06-14663072927600832944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_06-3590977674425514877?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_47_24-12230770771705613848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_56_46-4522140969710611919?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_02-40017786993293402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_54-2725540800629810636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_47_01-10825440431965137324?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_03-4246489135245145071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_32-16183209505905280149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_46_49-15768301990174075896?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_05-3438961247924128343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_15-16001526896869521762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_47_09-3894880592917027056?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_04-5360616413295854593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_27-16161256590710352480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_46_18-14330296398917338466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_04-8626601731616051218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_58-9242035053151810683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_46_58-10164378962742639342?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_13-11902700700564214284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_38_58-18274555886799476132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_28_03-3660578412406693636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_37_05-5699509409928844544?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_13_46_15-7500222026449076415?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 12s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/qyxtriezaq3kk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #62

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/62/display/redirect>

Changes:


------------------------------------------
[...truncated 5.52 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0306195639-079136", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute28db9f6-1369-4229-8c89-dc4d28b90880"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inpute28db9f6-1369-4229-8c89-dc4d28b90880", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outpute28db9f6-1369-4229-8c89-dc4d28b90880", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T19:56:53.287145Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_11_56_51-127312638772771337'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306195639-079136'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T19:56:53.287145Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_11_56_51-127312638772771337]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_51-127312638772771337?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_11_56_51-127312638772771337 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:51.963Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_11_56_51-127312638772771337. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:51.963Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_11_56_51-127312638772771337.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:51.963Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:57.282Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:58.301Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:58.871Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:58.893Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:58.951Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:58.982Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.015Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.040Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.067Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.108Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.128Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.153Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.186Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.207Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.232Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:56:59.259Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T19:57:24.125Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_11_56_51-127312638772771337 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2358.894s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_51-4107225258641822197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_06_42-16238239305097532199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_51-127312638772771337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_06_37-16853640296562987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_17_56-2102440193847536347?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_26_58-11808076937762151712?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_53-879709632107882586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_05_39-13162820352499500024?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_13_56-14143382725780720398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_49-17028332796340438275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_05_47-5276476874326778630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_15_05-2908782817219418709?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_51-8931893173339526380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_07_17-17139537898418282407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_16_25-12935435981111923046?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_51-8283746310498557315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_06_12-6901492111788838250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_14_58-14072995319714187887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_52-2389362753115888287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_06_21-16116903401795397932?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_15_54-10210612699927482214?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_11_56_50-17852733160864472598?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_06_15-6116615112992263964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_12_15_11-15661335440516597537?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 53s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/q7jmmyqbqmtku

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #61

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/61/display/redirect?page=changes>

Changes:

[suztomo] Google-cloud-bigquery 1.108.0


------------------------------------------
[...truncated 5.77 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input7970837b-06c4-449b-992c-53f3d9126b91"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input7970837b-06c4-449b-992c-53f3d9126b91", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output7970837b-06c4-449b-992c-53f3d9126b91", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
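
The three steps serialized above (ParallelRead s1, ParallelDo s2, ParallelWrite s3) describe a small streaming pipeline: read from Pub/Sub, count elements in a DoFn, write back to Pub/Sub. A minimal sketch of that shape, with hypothetical resource names (the real test lives in apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline):

import apache_beam as beam
from apache_beam.metrics import Metrics
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

class StreamingUserMetricsDoFn(beam.DoFn):
    """Stand-in for the test DoFn: bumps a user counter per element."""
    def process(self, element):
        Metrics.counter(self.__class__, 'total_values').inc()
        yield element

# Hypothetical names; the suite appends a unique uuid suffix per run.
input_sub = 'projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input-XXX'
output_topic = 'projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output-XXX'

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True
with beam.Pipeline(options=options) as p:
    _ = (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(subscription=input_sub)
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(output_topic))

Submitted to Dataflow, a pipeline of this shape should yield the same fused ReadFromPubSub/Read -> generate_metrics -> dump_to_pub/Write graph reported in the JOB_MESSAGE_DETAILED lines below.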
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T18:10:37.934114Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_10_10_36-418183917263642873'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306181013-474319'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T18:10:37.934114Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_10_10_36-418183917263642873]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_36-418183917263642873?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_10_10_36-418183917263642873 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:36.913Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_10_10_36-418183917263642873.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:36.913Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_10_10_36-418183917263642873. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:36.913Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:42.602Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:44.973Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.534Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.571Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.749Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.813Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.856Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.891Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:45.943Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.030Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.071Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.108Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.251Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.363Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.434Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:10:46.479Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:11:13.485Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:11:13.522Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:11:13.560Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:11:22.441Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T18:11:38.713Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_10_10_36-418183917263642873 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
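
The JOB_MESSAGE_WARNING above about the project already holding 100 Dataflow-created metric descriptors points at the Monitoring API for cleanup. A hedged sketch with a recent google-cloud-monitoring client (the 'custom.googleapis.com/dataflow' type prefix is an assumption; verify what the descriptors are actually named before deleting anything):

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
for descriptor in client.list_metric_descriptors(name='projects/apache-beam-testing'):
    # Assumed prefix for Dataflow-created custom metrics.
    if descriptor.type.startswith('custom.googleapis.com/dataflow'):
        client.delete_metric_descriptor(name=descriptor.name)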

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2373.406s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_30-2585597788843846202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_59-17570831872990202937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_31_23-10197107808900004303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_40_52-13377721109422685720?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_27-9501058198367617721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_51-5528848832603757311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_36-418183917263642873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_03-12245798639696251874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_28_44-3625244899055120462?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_31-13167619557199096034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_20_02-3943168318910247128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_29_28-10023925534157825269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_28-18357956628311268264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_55-2365970535810831701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_30_03-10926586215651706550?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_28-8566937903407876466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_49-8009772699140186214?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_29_13-18129792421521285553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_28-7668030463528069161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_19_15-4661354394911930378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_28_23-13177015262306770691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_10_27-7737039188853590901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_18_54-7789314202402160299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_10_27_19-4260069360461042524?project=apache-beam-testing
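
The links above open the Cloud Console log viewer; the same entries can also be pulled programmatically. A sketch assuming the google-cloud-logging client and one of the job ids listed above:

from google.cloud import logging as gcloud_logging

client = gcloud_logging.Client(project='apache-beam-testing')
log_filter = ('resource.type="dataflow_step" '
              'AND resource.labels.job_id="2020-03-06_10_10_36-418183917263642873"')
for entry in client.list_entries(filter_=log_filter):
    print(entry.timestamp, entry.payload)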

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 31s
64 actionable tasks: 61 executed, 3 from cache

Publishing build scan...
https://gradle.com/s/6hpqhi3iloe52

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #60

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/60/display/redirect?page=changes>

Changes:

[github] Fix a bug in performance test for reading data from BigQuery (#11062)


------------------------------------------
[...truncated 5.78 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input62d555f3-4082-4cbb-838a-fe4096cc4221"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input62d555f3-4082-4cbb-838a-fe4096cc4221", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output62d555f3-4082-4cbb-838a-fe4096cc4221", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T14:21:41.170222Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_06_21_39-775260322719349908'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306142124-248646'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T14:21:41.170222Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_06_21_39-775260322719349908]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_39-775260322719349908?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_06_21_39-775260322719349908 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:40.008Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:40.008Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_06_21_39-775260322719349908. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:40.008Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_06_21_39-775260322719349908.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:45.383Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:46.573Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.201Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.282Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.450Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.526Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.563Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.607Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.658Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.841Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.899Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.937Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:47.988Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:48.025Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:48.056Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:48.094Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:51.543Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:51.575Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:21:51.606Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:22:12.065Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T14:22:32.628Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_06_21_39-775260322719349908 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
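
The "Timing out on waiting for job ... after 60 seconds" warning above comes from the harness waiting with a bounded duration, not from the job itself failing. A minimal sketch of that pattern, assuming result is the PipelineResult returned by p.run() (duration is in milliseconds):

state = result.wait_until_finish(duration=60 * 1000)
if state not in ('DONE', 'CANCELLED'):
    # Streaming jobs never drain on their own; cancel explicitly when the test is done.
    result.cancel()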

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2267.612s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_40-2296379256012916528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_31_15-377233118537043559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_40_23-5310551332399893247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_40-12910016848501651681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_32_15-17845561703676017050?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_42-2945994972406767788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_31_16-11126667133577487499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_41_03-4073908831418449474?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_39-775260322719349908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_29_48-11194074789052104685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_38_37-13340103602861918328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_40-4453044043114024287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_31_25-2476024577751304192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_41_25-6098482894044901098?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_49_53-5062483330836252615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_41-3448713655809502531?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_31_25-3598066930725575317?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_40_50-1025673777678515038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_41-7494688615575262727?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_30_34-4771190103162168586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_39_28-16929100138175648795?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_21_38-10254639164695291723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_31_14-15951297242390162261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_06_40_28-17456811700036125318?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 25s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/luilq3heqhkvo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #59

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/59/display/redirect>

Changes:


------------------------------------------
[...truncated 5.82 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0306124402-944244", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0cab36f7-d6f0-4853-a020-2acdaa2cc126"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0cab36f7-d6f0-4853-a020-2acdaa2cc126", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output0cab36f7-d6f0-4853-a020-2acdaa2cc126", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T12:44:30.740414Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_04_44_29-507105235592016638'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306124402-944244'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T12:44:30.740414Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_04_44_29-507105235592016638]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_29-507105235592016638?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_04_44_29-507105235592016638 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:29.823Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_04_44_29-507105235592016638.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:29.823Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:29.823Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_04_44_29-507105235592016638. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:43.663Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:44.715Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.300Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.525Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.587Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.633Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.660Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.692Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.729Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.786Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.816Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.853Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.899Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.945Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:45.969Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:44:46.009Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T12:45:09.885Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_04_44_29-507105235592016638 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
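
After the bounded wait, a metrics-validating suite like this one queries user counters off the result. A hedged sketch, reusing the hypothetical 'total_values' counter from the pipeline sketch earlier:

from apache_beam.metrics.metric import MetricsFilter

query = result.metrics().query(MetricsFilter().with_name('total_values'))
for counter in query['counters']:
    print(counter.key.metric.name, counter.committed)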

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2413.803s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_23-11513429696397206390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_55_14-11018559876655285202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_05_13-11454359368750849658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_14_07-12698373154303639888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_26-1230845377718380182?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_54_21-2989743330868421737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_03_36-6742409059776756019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_29-507105235592016638?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_56_42-8756623336945112343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_28-16126636163413326131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_53_59-6647267211343336107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_02_49-15993460429701778222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_25-2474471722629727390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_52_33-9505914942355464381?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_01_27-10916725545525192164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_25-2434205426754655900?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_54_02-15861175375905299472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_03_57-4545859549412527095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_28-6083787003464251200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_54_59-5243115206852054961?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_04_23-468097517304653947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_44_24-7198467816156668343?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_04_53_01-7297182086475128185?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_05_02_37-3617401429186120469?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 23m 19s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/zeo3b6hpfwxw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #58

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/58/display/redirect>

Changes:


------------------------------------------
[...truncated 5.88 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5639ea73-8865-4a7c-a52f-6a3e7e87f5a6"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5639ea73-8865-4a7c-a52f-6a3e7e87f5a6", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output5639ea73-8865-4a7c-a52f-6a3e7e87f5a6", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T08:01:00.786576Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-06_00_00_59-15301405800244796766'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306080042-670374'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T08:01:00.786576Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-06_00_00_59-15301405800244796766]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_59-15301405800244796766?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-06_00_00_59-15301405800244796766 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:00:59.726Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:00:59.726Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-06_00_00_59-15301405800244796766.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:00:59.726Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-06_00_00_59-15301405800244796766. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:13.356Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:14.688Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.372Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.406Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.480Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.521Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.556Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.589Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.626Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.688Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.724Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.755Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.804Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.839Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.878Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:15.915Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:18.181Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:18.214Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:18.249Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:31.720Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:01:42.280Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:02:08.202Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T08:02:08.245Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-06_00_00_59-15301405800244796766 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
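
The "Timing out on waiting for job ... after 60 seconds" warning above is the test harness giving up its bounded wait on a streaming job, not the job itself failing. A minimal sketch of that pattern, assuming a trivial stand-in transform and a placeholder subscription (wait_until_finish takes a duration in milliseconds in the Python SDK):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def run(argv=None):
        options = PipelineOptions(argv)
        options.view_as(StandardOptions).streaming = True

        p = beam.Pipeline(options=options)
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input>')  # placeholder
             | 'generate_metrics' >> beam.Map(lambda msg: msg))  # stand-in for the real DoFn

        result = p.run()
        # Bound the wait: returning before JOB_STATE_DONE is what produces the
        # "Timing out on waiting for job <id> after 60 seconds" warning above.
        result.wait_until_finish(duration=60 * 1000)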

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2214.964s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_58-10126300484266942351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_11_18-9191855575403584018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_20_49-15866805658723635423?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_29_28-6105716361071238743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_58-11317011882384127962?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_10_18-5418229258151900411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_20_19-6531611254856552260?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_59-15301405800244796766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_09_05-14667473053401734794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_18_16-9945544776307072705?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_59-4278744138524333316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_09_59-1617116598433757723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_19_06-4383895867023340181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_55-2509717792868635808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_10_20-14626548199657583072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_20_41-8080457338215638680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_56-1976929558304237547?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_10_17-12184941395298519578?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_20_08-15103577089850202244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_58-15372578008570214630?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_10_25-8640342979638351192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_00_56-16708405578687426951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_09_50-8686770297402042864?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-06_00_18_20-10622443669814930087?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 43s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jsruyocasl3cc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #57

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/57/display/redirect?page=changes>

Changes:

[github] Merge pull request #11032 from [BEAM-8335] Display rather than logging


------------------------------------------
[...truncated 5.76 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input438c24cc-4b14-45b8-ac5e-1070a2703cfd"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input438c24cc-4b14-45b8-ac5e-1070a2703cfd", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output438c24cc-4b14-45b8-ac5e-1070a2703cfd", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
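
The job graph above has three steps: a ParallelRead from Pub/Sub (s1), a ParallelDo applying StreamingUserMetricsDoFn (s2), and a ParallelWrite back to Pub/Sub (s3). A sketch of the Python pipeline shape that serializes to such a graph; only the transform labels and step kinds are taken from the graph, while the DoFn body and counter name are assumptions:

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Assumed body: bump a user counter per element and pass it through."""
        def __init__(self):
            self.element_counter = Metrics.counter(self.__class__, 'elements')  # name assumed

        def process(self, element):
            self.element_counter.inc()
            yield element

    def build(p, input_subscription, output_topic):
        return (p
                | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(subscription=input_subscription)
                | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
                | 'dump_to_pub' >> beam.io.WriteToPubSub(output_topic))
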
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T06:39:10.273732Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_22_39_09-8280391741191099814'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306063852-143271'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T06:39:10.273732Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_22_39_09-8280391741191099814]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_09-8280391741191099814?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_22_39_09-8280391741191099814 is in state JOB_STATE_RUNNING
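
The Job proto and state transition above come from the SDK's apiclient calling the Dataflow service; the same job can be inspected directly over the v1b3 REST surface. A hedged sketch with google-api-python-client, using the job id from the log and ambient application-default credentials:

    from googleapiclient.discovery import build

    dataflow = build('dataflow', 'v1b3')  # picks up application-default credentials
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2020-03-05_22_39_09-8280391741191099814').execute()
    # Should report JOB_STATE_RUNNING, matching the runner log line above.
    print(job.get('currentState'))
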
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:09.278Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:09.278Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_22_39_09-8280391741191099814.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:09.278Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_22_39_09-8280391741191099814. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:12.287Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.298Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.823Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.841Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.894Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.961Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:13.987Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.013Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.040Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.081Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.106Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.129Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.168Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.194Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.214Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:14.238Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:20.004Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:20.032Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:20.058Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:26.527Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:39:43.820Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:40:10.609Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T06:40:10.631Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_22_39_09-8280391741191099814 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
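
The google.auth/urllib3 DEBUG lines above show the standard GCE credential bootstrap: probe the metadata server, read the default service account, then fetch an access token. The same flow, made explicit with the public google-auth API:

    import google.auth
    from google.auth.transport.requests import Request

    # On a GCE worker this resolves to Compute Engine credentials backed by
    # http://metadata.google.internal, the host in the DEBUG lines above.
    credentials, project = google.auth.default()
    credentials.refresh(Request())  # GET .../service-accounts/<sa>/token
    print(project, credentials.token is not None)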

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2324.950s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_07-1431404865722104216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_49_42-6458309062510181666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_59_02-13926308791863069436?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_23_08_36-1468848302248432730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_05-12966115258039349687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_47_56-15000641962722245697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_56_33-12384868234676483657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_09-8280391741191099814?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_46_39-16428954290669971854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_55_39-13496817717083984954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_08-13953020550682299115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_49_52-15226783677660426603?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_03-14466332303982663673?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_48_03-8378518922554215794?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_57_24-565830793115116908?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_05-13831733030883360166?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_48_59-14924205697146773130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_57_47-15206772530685556667?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_07-10521824682771755881?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_47_38-15181593128464986813?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_56_33-2541855699182495028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_39_06-3648533592739667159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_48_06-9363331108640413631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_22_56_17-17716056079386942250?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 0s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/vjjzfmczinqs6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #56

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/56/display/redirect?page=changes>

Changes:

[chuck.yang] Use Avro format for file loads to BigQuery

[github] Revert "[BEAM-6374] Emit PCollection metrics from GoSDK (#10942)"


------------------------------------------
[...truncated 43.10 KB...]
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
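
The RefactoringTool lines above are gen_protos running 2to3 over the freshly generated *_pb2*.py modules so they load under both Python 2 and 3. A minimal stand-alone equivalent using the stdlib lib2to3 (the file path is a placeholder):

    from lib2to3 import refactor

    fixers = refactor.get_fixers_from_package('lib2to3.fixes')
    tool = refactor.RefactoringTool(fixers)
    # write=True rewrites the file in place; the tool's logger reports
    # "Refactored ..." or "No changes to ..." as in the output above.
    tool.refactor_file('apache_beam/portability/api/beam_fn_api_pb2.py', write=True)
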
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :sdks:java:harness:shadowJar

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl (1.0 MB)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (1.27.2)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached mock-2.0.0-py2.py3-none-any.whl (56 kB)
Collecting numpy<2,>=1.14.3
  Using cached numpy-1.16.6-cp27-cp27mu-manylinux1_x86_64.whl (17.0 MB)

> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
Collecting pymongo<4.0.0,>=3.8.0
  Using cached pymongo-3.10.1-cp27-cp27mu-manylinux1_x86_64.whl (444 kB)
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.11.3)
Collecting pydot<2,>=1.2.0
  Using cached pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting python-dateutil<3,>=2.8.0
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2019.3-py2.py3-none-any.whl (509 kB)
Processing /home/jenkins/.cache/pip/wheels/28/a0/fc/a3d4892b81eedc9e027323c08e9890bee1ec2d35edd1c1ac96/avro-1.9.2-py2-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached funcsigs-1.0.2-py2.py3-none-any.whl (17 kB)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl (17.5 MB)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.7.4.1)
Collecting typing-extensions<3.8.0,>=3.7.0
  Using cached typing_extensions-3.7.4.1-py2-none-any.whl (9.0 kB)
Collecting cachetools<4,>=3.1.0
  Using cached cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Collecting google-apitools<0.5.29,>=0.5.28
  Using cached google_apitools-0.5.28-py2-none-any.whl (134 kB)
Collecting google-cloud-datastore<1.8.0,>=1.7.1
  Using cached google_cloud_datastore-1.7.4-py2.py3-none-any.whl (82 kB)
Collecting google-cloud-pubsub<1.1.0,>=0.39.0
  Using cached google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-cloud-bigquery<=1.24.0,>=1.6.0
  Using cached google_cloud_bigquery-1.24.0-py2.py3-none-any.whl (165 kB)
Collecting google-cloud-core<2,>=0.28.1
  Using cached google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
  Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0
  Using cached google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting grpcio-gcp<1,>=0.2.2
  Using cached grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0
  Using cached google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language<2,>=1.3.0
  Using cached google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0
  Using cached google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Processing /home/jenkins/.cache/pip/wheels/09/61/a5/7e8f4442b3c3d406ee9eb6c06e1ecbe5625f62f8cb19c08f5b/googledatastore-7.0.2-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/bd/ce/33/8b769968db3761c42c7a91d8a0dbbafc50acfa0750866c8abd/proto_google_cloud_datastore_v1-0.90.4-cp27-none-any.whl
Collecting freezegun>=0.3.12
  Using cached freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting nose>=1.3.7
  Using cached nose-1.3.7-py2-none-any.whl (154 kB)
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp27-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl (10.1 MB)
Collecting parameterized<0.8.0,>=0.7.1
  Using cached parameterized-0.7.1-py2.py3-none-any.whl (24 kB)
Processing /home/jenkins/.cache/pip/wheels/8f/88/5d/2d12b9e226ee11ce171a603275d8dd6546d93202466b7fe173/PyHamcrest-1.10.1-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/e4/76/4d/a95b8dd7b452b69e8ed4f68b69e1b55e12c9c9624dd962b191/PyYAML-5.3-cp27-cp27mu-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.7.0-py2.py3-none-any.whl (23 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.9-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.31.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.3.4-py2.py3-none-any.whl (10 kB)
Requirement already satisfied: six>=1.5.2 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.21.0.dev0) (1.14.0)
Requirement already satisfied: enum34>=1.0.4; python_version < "3.4" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from grpcio<2,>=1.12.1->apache-beam==2.21.0.dev0) (1.1.9)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting requests>=2.7.0
  Using cached requests-2.23.0-py2.py3-none-any.whl (58 kB)
Collecting pbr>=0.11
  Using cached pbr-5.4.4-py2.py3-none-any.whl (110 kB)
Collecting rsa>=3.1.4
  Using cached rsa-4.0-py2.py3-none-any.whl (38 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting pyasn1-modules>=0.0.5
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: setuptools in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from protobuf<4,>=3.5.0.post1->apache-beam==2.21.0.dev0) (44.0.0)
Collecting pyparsing>=2.1.4
  Using cached pyparsing-2.4.6-py2.py3-none-any.whl (67 kB)
Collecting fasteners>=0.14
  Using cached fasteners-0.15-py2.py3-none-any.whl (23 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached google_api_core-1.16.0-py2.py3-none-any.whl (70 kB)
Processing /home/jenkins/.cache/pip/wheels/de/3a/83/77a1e18e1a8757186df834b86ce6800120ac9c79cd8ca4091b/grpc_google_iam_v1-0.12.3-cp27-none-any.whl
Collecting google-auth<2.0dev,>=1.9.0
  Using cached google_auth-1.11.2-py2.py3-none-any.whl (76 kB)
Collecting google-resumable-media<0.6dev,>=0.5.0
  Using cached google_resumable_media-0.5.0-py2.py3-none-any.whl (38 kB)
Processing /home/jenkins/.cache/pip/wheels/2c/f9/7f/6eb87e636072bf467e25348bbeb96849333e6a080dca78f706/googleapis_common_protos-1.51.0-cp27-none-any.whl
Collecting monotonic>=0.6; python_version == "2.7"
  Using cached monotonic-1.5-py2.py3-none-any.whl (5.3 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.3.0-py2.py3-none-any.whl (5.9 kB)
Requirement already satisfied: pluggy<1.0,>=0.12 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (0.13.1)
Collecting packaging
  Using cached packaging-20.3-py2.py3-none-any.whl (37 kB)
Collecting attrs>=17.4.0
  Using cached attrs-19.3.0-py2.py3-none-any.whl (39 kB)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.5.0)
Collecting wcwidth
  Using cached wcwidth-0.1.8-py2.py3-none-any.whl (17 kB)
Collecting more-itertools<6.0.0,>=4.0.0; python_version <= "2.7"
  Using cached more_itertools-5.0.0-py2-none-any.whl (52 kB)
Requirement already satisfied: pathlib2>=2.2.0; python_version < "3.6" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (2.3.5)
Requirement already satisfied: py>=1.5.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.8.1)
Collecting pytest-forked
  Using cached pytest_forked-1.1.3-py2.py3-none-any.whl (4.5 kB)
Collecting execnet>=1.1
  Using cached execnet-1.7.1-py2.py3-none-any.whl (39 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Collecting idna<3,>=2.5
  Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting chardet<4,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached urllib3-1.25.8-py2.py3-none-any.whl (125 kB)
Requirement already satisfied: contextlib2; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.2.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.21.0.dev0) (1.10.0)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.21.0.dev0-py2-none-any.whl size=1963447 sha256=c68d04ba5bbe43d1f967cc414ca225f984843b447e61a34d5e88e38b87d7099d
  Stored in directory: /home/jenkins/.cache/pip/wheels/ad/b9/f4/76cfa08e37c58977b75eacbec4ff8f3ac2a0a8e409dca0ed3a
Successfully built apache-beam
Installing collected packages: crcmod, dill, fastavro, docopt, certifi, idna, chardet, urllib3, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro, pyvcf, pyarrow, typing-extensions, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, google-cloud-dlp, google-cloud-language, google-cloud-videointelligence, google-cloud-vision, proto-google-cloud-datastore-v1, googledatastore, freezegun, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, atomicwrites, packaging, attrs, wcwidth, more-itertools, pytest, pytest-forked, apipkg, execnet, pytest-xdist, pytest-timeout, apache-beam

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device
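
Errno 28 means the Jenkins executor's disk filled up mid-install: pip is unpacking large wheels (numpy, pyarrow, pandas) while also keeping copies under /home/jenkins/.cache/pip, so this failure is environmental rather than a dependency problem. A quick pre-flight check that runs on the Python 2.7 interpreter this suite uses (the second path is an assumption):

    import os

    def free_mib(path):
        st = os.statvfs(path)  # POSIX-only; available on both py2 and py3
        return st.f_bavail * st.f_frsize // (1024 * 1024)

    for path in ('/home/jenkins', '/tmp'):
        print(path, free_mib(path), 'MiB free')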


> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 21s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/sk77nfknnycym

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #55

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/55/display/redirect>

Changes:


------------------------------------------
[...truncated 5.73 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0ca51bf3-7d13-4b3e-b35b-0256a4267185"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0ca51bf3-7d13-4b3e-b35b-0256a4267185", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output0ca51bf3-7d13-4b3e-b35b-0256a4267185", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-06T00:58:48.826702Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_16_58_47-7530378878731391639'
 location: u'us-central1'
 name: u'beamapp-jenkins-0306005831-244803'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-06T00:58:48.826702Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_16_58_47-7530378878731391639]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_47-7530378878731391639?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_16_58_47-7530378878731391639 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:47.407Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_16_58_47-7530378878731391639.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:47.407Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:47.407Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_16_58_47-7530378878731391639. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:50.734Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:51.783Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.402Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.443Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.531Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.576Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.610Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.645Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.671Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.718Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.749Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.784Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.828Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.857Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.894Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:58:52.922Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:59:23.378Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:59:23.411Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:59:23.441Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:59:25.287Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-06T00:59:48.324Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_16_58_47-7530378878731391639 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
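[Editor's note: the metadata-server DEBUG lines above show google-auth resolving Application Default Credentials on the GCE worker. The equivalent explicit calls, as a sketch:

    import google.auth
    import google.auth.transport.requests

    # On GCE this probes http://metadata.google.internal for the default
    # service account, matching the GET requests logged above.
    credentials, project = google.auth.default()
    # Refreshing fetches a token, producing the .../token request above.
    credentials.refresh(google.auth.transport.requests.Request())
]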
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2264.465s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_44-2973365476614843467?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_07_47-6756626056221515835?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_17_21-9147948811875837720?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_47-7530378878731391639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_07_20-5950288354417009302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_16_29-7937971941460313662?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_46-4147135620143190167?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_08_43-2873161209096467915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_48-6925158345784090982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_08_18-8011154590373380725?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_16_51-5334950425036736934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_43-14048324965540274196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_07_37-9154193224316377675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_17_33-527347239458852279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_47-16063237701546311864?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_09_16-5912367384221311159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_17_57-13706497749308753913?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_47-7617089547070525288?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_08_41-4324619171720017109?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_18_09-11281359106196790442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_28_06-11718573381316839292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_16_58_46-15171654104214235894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_08_45-16832936932573124535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_17_18_10-8586792000454523536?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 12s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/tz6enoqnks6aa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #54

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/54/display/redirect?page=changes>

Changes:

[git] Remove optionality and add sensible defaults to PubsubIO builders.

[iemejia] [BEAM-9450] Update www.apache.org/dist/ links to downloads.apache.org

[iemejia] [BEAM-9450] Convert links available via https to use https


------------------------------------------
[...truncated 5.47 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb24cfa9c-d92c-457c-b436-5bd9bf614ffc"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb24cfa9c-d92c-457c-b436-5bd9bf614ffc", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputb24cfa9c-d92c-457c-b436-5bd9bf614ffc", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
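[Editor's note: for orientation, the three steps in this graph (s1 ParallelRead, s2 ParallelDo, s3 ParallelWrite) correspond to a pipeline of roughly the following shape. This is a sketch using the step labels from the graph; it assumes StreamingUserMetricsDoFn is importable from the module named above, and the elided option values and Pub/Sub paths are placeholders, not the exact test-harness flags:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)

    options = PipelineOptions()  # real runs add --runner, --project, etc.
    options.view_as(StandardOptions).streaming = True  # yields JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription='projects/apache-beam-testing/subscriptions/...')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
               topic='projects/apache-beam-testing/topics/...'))
]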
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T22:41:28.158363Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_14_41_26-11484404249539900866'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305224114-264341'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T22:41:28.158363Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_14_41_26-11484404249539900866]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_26-11484404249539900866?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_14_41_26-11484404249539900866 is in state JOB_STATE_RUNNING
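[Editor's note: the runner obtains this state by polling the Dataflow service. An equivalent manual check, as a sketch using the v1b3 REST client and the job id from this log:

    from googleapiclient.discovery import build

    # Assumes google-api-python-client is installed and ADC is available.
    dataflow = build('dataflow', 'v1b3')
    job = dataflow.projects().locations().jobs().get(
        projectId='apache-beam-testing',
        location='us-central1',
        jobId='2020-03-05_14_41_26-11484404249539900866').execute()
    print(job['currentState'])  # e.g. JOB_STATE_RUNNING
]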
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:26.890Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_14_41_26-11484404249539900866.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:26.890Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_14_41_26-11484404249539900866. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:26.890Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:30.564Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:31.697Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.502Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.546Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.652Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.693Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.720Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.762Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.805Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.868Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.908Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.939Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:32.986Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:33.021Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:33.057Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:33.090Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:37.140Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:37.173Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:37.205Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:41:56.988Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T22:42:00.536Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_14_41_26-11484404249539900866 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2373.013s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_25-10616526701651518891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_51_22-12662991971038094229?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_26-1839006936376394784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_50_50-5510845308780253882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_15_00_53-665798128617858695?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_15_11_02-6838660377796141656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_26-2801604665776254843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_50_28-11561849885244438188?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_15_00_10-6295712757998406890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_26-11484404249539900866?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_49_37-15386086856890396772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_58_10-14955234003122060303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_27-12030365967759331528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_50_41-1238872222029442620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_59_33-7938581097739818274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_29-13413951223014261023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_50_08-13715245784317627851?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_59_01-11395012240431590106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_25-11849331119993880231?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_49_46-639105852567481468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_58_44-13557556860533950805?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_41_25-7971838604941224241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_14_51_08-16083407611949805681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_15_00_39-13956782793656666460?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 21m 49s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/w4diypdi5zegi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #53

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/53/display/redirect?page=changes>

Changes:

[jkai] [Hotfix] fix rabbitmq spotless check


------------------------------------------
[...truncated 5.72 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0305210540-152194", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2a15cd84-9ded-4b7e-a87f-db91a289f08b"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2a15cd84-9ded-4b7e-a87f-db91a289f08b", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2a15cd84-9ded-4b7e-a87f-db91a289f08b", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T21:06:05.711821Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_13_06_04-17152559469082580141'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305210540-152194'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T21:06:05.711821Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_13_06_04-17152559469082580141]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_06_04-17152559469082580141?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_13_06_04-17152559469082580141 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:04.351Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_13_06_04-17152559469082580141. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:04.351Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:04.352Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_13_06_04-17152559469082580141.
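[Editor's note: the autoscaling warning fires because the test pipeline leaves maxNumWorkers unset, so the service defaults to a 1-100 range. A sketch of pinning it from the Python SDK; the cap of 2 is illustrative:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--streaming',
        '--max_num_workers=2',  # bounds autoscaling; silences the warning
    ])
]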
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:17.620Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:18.730Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.313Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.351Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.424Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.464Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.499Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.535Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.569Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.629Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.663Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.695Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.737Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.765Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.802Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:19.836Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T21:06:48.109Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_13_06_04-17152559469082580141 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2485.930s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_55-10504374680886559449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_17_40-13446998942787818262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_27_39-1474103475132273090?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_37_17-10594548991485106850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_54-347915595060990927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_15_16-8001196851419569487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_24_51-16522249982414579039?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_06_04-17152559469082580141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_15_05-7394278527292491402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_24_25-525956659683570714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_55-7507168249544330302?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_15_45-10292904921117018553?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_56-4045446239458401982?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_15_38-328293722263261777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_24_57-9916737317717187432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_54-9677646654970471565?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_16_16-16402777824286735158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_26_19-16699590658069812299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_58-6245724989939732862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_16_28-17935803088129694419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_25_19-15711298349416735440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_05_55-10380492822731544257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_15_02-12257475186617321285?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_13_24_57-1742113352428570813?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 23m 40s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/qd26infdwkhnc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #52

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/52/display/redirect?page=changes>

Changes:

[amaliujia] [BEAM-9288] Not bundle conscrypt in gRPC vendor in META-INF/

[lcwik] [BEAM-9452] Update classgraph to latest version to resolve windows


------------------------------------------
[...truncated 5.49 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input72859760-d71d-4df5-bd32-1d365d5b0cde"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input72859760-d71d-4df5-bd32-1d365d5b0cde", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output72859760-d71d-4df5-bd32-1d365d5b0cde", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T19:35:55.948316Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_11_35_54-13224572022000084887'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305193537-324270'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T19:35:55.948316Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_11_35_54-13224572022000084887]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_54-13224572022000084887?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_11_35_54-13224572022000084887 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:35:54.815Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_11_35_54-13224572022000084887. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:35:54.815Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:35:54.815Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_11_35_54-13224572022000084887.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:08.428Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:09.323Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:09.943Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:09.975Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.050Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.095Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.121Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.153Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.185Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.249Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.275Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.301Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.345Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.383Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.420Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:10.460Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:13.936Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:13.961Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:13.994Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:34.956Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:36:39.882Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:37:12.256Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T19:37:12.297Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_11_35_54-13224572022000084887 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2396.718s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_51-17264410009600244257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_46_09-16890902335204101913?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_55_49-1938172555221045348?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_54-13224572022000084887?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_44_12-9328499625968921149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_54_27-4786309850571408619?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_52-174816657644966247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_45_46-4436039928687048870?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_55_29-17484855873866954858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_52-13554983060481355781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_45_12-10170208185222672849?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_54_54-3206999437373666855?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_52-440717490198733100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_45_07-4786828690848827086?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_55_25-14030263210380395366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_53-12676572101332471353?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_46_00-5113744430029936107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_55-760577544102969783?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_44_14-17592163574334448167?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_54_48-9682318841438186652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_35_51-13441330518289121917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_45_28-11586127962506030124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_11_55_56-9878416609523354644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_12_06_17-14351553738711510830?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 42s
64 actionable tasks: 53 executed, 11 from cache

Publishing build scan...
https://gradle.com/s/ion36lmuszvm4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #51

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/51/display/redirect>

Changes:


------------------------------------------
[...truncated 25.41 KB...]
Processing /home/jenkins/.cache/pip/wheels/45/c0/60/00682619a42655ca6ccd0c56343441f46e41a886476a34b7eb/mypy_protobuf-1.18-py2-none-any.whl
Collecting six<2,>=1.0.0
  Using cached six-1.14.0-py2.py3-none-any.whl (10 kB)
Processing /home/jenkins/.cache/pip/wheels/66/13/60/ef107438d90e4aad6320e3424e50cfce5e16d1e9aad6d38294/filelock-3.0.12-cp27-none-any.whl
Collecting virtualenv>=14.0.0
  Using cached virtualenv-20.0.8-py2.py3-none-any.whl (4.6 MB)
Collecting pluggy<1,>=0.3.0
  Using cached pluggy-0.13.1-py2.py3-none-any.whl (18 kB)
Collecting py<2,>=1.4.17
  Using cached py-1.8.1-py2.py3-none-any.whl (83 kB)
Collecting toml>=0.9.4
  Using cached toml-0.10.0-py2.py3-none-any.whl (25 kB)
Requirement already satisfied, skipping upgrade: setuptools>=30.0.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from tox==3.11.1) (44.0.0)
Collecting grpcio>=1.14.2
  Using cached grpcio-1.27.2-cp27-cp27mu-manylinux2010_x86_64.whl (2.6 MB)
Collecting protobuf>=3.5.0.post1
  Using cached protobuf-3.11.3-cp27-cp27mu-manylinux1_x86_64.whl (1.3 MB)
Processing /home/jenkins/.cache/pip/wheels/0c/88/ac/41500883ea902d3409a83a827870a726346b5ebfd0523e91df/distlib-0.3.0-py2-none-any.whl
Collecting importlib-resources<2,>=1.0; python_version < "3.7"
  Using cached importlib_resources-1.2.0-py2.py3-none-any.whl (31 kB)
Collecting pathlib2<3,>=2.3.3; python_version < "3.4" and sys_platform != "win32"
  Using cached pathlib2-2.3.5-py2.py3-none-any.whl (18 kB)
Collecting contextlib2<1,>=0.6.0; python_version < "3.3"
  Using cached contextlib2-0.6.0.post1-py2.py3-none-any.whl (9.8 kB)
Collecting appdirs<2,>=1.4.3
  Using cached appdirs-1.4.3-py2.py3-none-any.whl (12 kB)
Collecting importlib-metadata<2,>=0.12; python_version < "3.8"
  Using cached importlib_metadata-1.5.0-py2.py3-none-any.whl (30 kB)
Collecting enum34>=1.0.4; python_version < "3.4"
  Using cached enum34-1.1.9-py2-none-any.whl (11 kB)
Collecting futures>=2.2.0; python_version < "3.2"
  Using cached futures-3.3.0-py2-none-any.whl (16 kB)
Collecting singledispatch; python_version < "3.4"
  Using cached singledispatch-3.4.0.3-py2.py3-none-any.whl (12 kB)
Collecting zipp>=0.4; python_version < "3.8"
  Using cached zipp-1.2.0-py2.py3-none-any.whl (4.8 kB)
Collecting typing; python_version < "3.5"
  Using cached typing-3.7.4.1-py2-none-any.whl (26 kB)
Processing /home/jenkins/.cache/pip/wheels/91/95/75/19c98a91239878abbc7c59970abd3b4e0438a7dd5b61778335/scandir-1.10.0-cp27-cp27mu-linux_x86_64.whl
Collecting configparser>=3.5; python_version < "3"
  Using cached configparser-4.0.2-py2.py3-none-any.whl (22 kB)
Installing collected packages: six, filelock, distlib, singledispatch, scandir, pathlib2, contextlib2, zipp, configparser, importlib-metadata, typing, importlib-resources, appdirs, virtualenv, pluggy, py, toml, tox, enum34, futures, grpcio, protobuf, grpcio-tools, future, mypy-protobuf
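
[Editor's note: the qualifiers such as python_version < "3.4" in the pip lines above are PEP 508 environment markers; pip evaluates them against the running interpreter, which is why backports like enum34, futures, and configparser are pulled in on this Python 2.7 agent but skipped on Python 3. A minimal, illustrative setup.py fragment showing how such markers are declared; the pins mirror the log, but the package itself is hypothetical:]

    # Hypothetical setup.py fragment; version pins copied from the install log above.
    from setuptools import setup

    setup(
        name='example-backports',  # placeholder name, not a real Beam artifact
        install_requires=[
            'six>=1.0.0,<2',
            'enum34>=1.0.4; python_version < "3.4"',      # enum backport for old interpreters
            'futures>=2.2.0; python_version < "3.2"',     # concurrent.futures backport
            'configparser>=3.5; python_version < "3"',    # py3-style configparser on py2
        ],
    )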

> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :vendor:sdks-java-extensions-protobuf:compileJava FROM-CACHE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :vendor:sdks-java-extensions-protobuf:shadowJar
> Task :runners:core-construction-java:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:core-java:jar
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:harness:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar

> Task :sdks:python:test-suites:dataflow:py2:setupVirtualenv
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device


> Task :sdks:python:test-suites:dataflow:py2:setupVirtualenv FAILED
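
[Editor's note: the task died on ENOSPC ([Errno 28]), an agent-side environment problem rather than a test regression. A small pre-flight sketch that would fail fast before pip starts unpacking wheels; the path matches the log, while the 5 GB margin is an arbitrary assumption:]

    # Illustrative pre-flight disk check; threshold and path are assumptions.
    import os

    st = os.statvfs('/home/jenkins')            # filesystem stats for the agent home
    free_gb = st.f_bavail * st.f_frsize / 1e9   # blocks available to non-root * fragment size
    if free_gb < 5.0:
        raise SystemExit('only %.1f GB free; clean old workspaces first' % free_gb)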

> Task :sdks:python:setupVirtualenv
Successfully installed appdirs-1.4.3 configparser-4.0.2 contextlib2-0.6.0.post1 distlib-0.3.0 enum34-1.1.9 filelock-3.0.12 future-0.16.0 futures-3.3.0 grpcio-1.27.2 grpcio-tools-1.14.2 importlib-metadata-1.5.0 importlib-resources-1.2.0 mypy-protobuf-1.18 pathlib2-2.3.5 pluggy-0.13.1 protobuf-3.11.3 py-1.8.1 scandir-1.10.0 singledispatch-3.4.0.3 six-1.14.0 toml-0.10.0 tox-3.11.1 typing-3.7.4.1 virtualenv-20.0.8 zipp-1.2.0

> Task :sdks:java:harness:shadowJar

> Task :sdks:python:sdist
setup.py:253: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.21.0.dev' to '2.21.0.dev0'
  normalized_version,
INFO:gen_protos:Regenerating Python proto definitions (no output files).
INFO:gen_protos:Found protoc_gen_mypy at <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/1922375555/bin/protoc-gen-mypy>
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
beam_interactive_api.proto: warning: Import google/protobuf/timestamp.proto but not used.
Writing mypy to endpoints_pb2.pyi
Writing mypy to external_transforms_pb2.pyi
Writing mypy to beam_provision_api_pb2.pyi
Writing mypy to beam_runner_api_pb2.pyi
Writing mypy to standard_window_fns_pb2.pyi
Writing mypy to beam_artifact_api_pb2.pyi
Writing mypy to beam_fn_api_pb2.pyi
Writing mypy to metrics_pb2.pyi
Writing mypy to schema_pb2.pyi
Writing mypy to beam_job_api_pb2.pyi
Writing mypy to beam_interactive_api_pb2.pyi
Writing mypy to beam_expansion_api_pb2.pyi
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 12 times...]
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 23 times...]
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 21 times...]
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>

> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:sdist
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md
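
[Editor's note: the sdist step above regenerates the portability protos and their mypy stubs (gen_protos plus protoc-gen-mypy from mypy-protobuf). A stripped-down sketch of the same mechanism using grpcio-tools' embedded protoc; the directory and .proto names are placeholders, and it assumes protoc-gen-mypy is on PATH:]

    # Minimal proto-regeneration sketch; file and directory names are placeholders.
    from grpc_tools import protoc

    ret = protoc.main([
        'grpc_tools.protoc',
        '--proto_path=protos',
        '--python_out=apache_beam/portability/api',       # *_pb2.py modules
        '--grpc_python_out=apache_beam/portability/api',  # *_pb2_grpc.py stubs
        '--mypy_out=apache_beam/portability/api',         # *.pyi via protoc-gen-mypy
        'beam_runner_api.proto',
    ])
    if ret != 0:
        raise RuntimeError('protoc returned %d' % ret)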


> Task :runners:google-cloud-dataflow-java:worker:shadowJar

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:setupVirtualenv'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 1s
61 actionable tasks: 43 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/klwdqgydib4j2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #50

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/50/display/redirect?page=changes>

Changes:

[hktang] [BEAM-9453] Changed new string creation to use StandardCharsets.UTF_8


------------------------------------------
[...truncated 36.27 KB...]
  Using cached singledispatch-3.4.0.3-py2.py3-none-any.whl (12 kB)
Collecting zipp>=0.4; python_version < "3.8"
  Using cached zipp-1.2.0-py2.py3-none-any.whl (4.8 kB)
Collecting typing; python_version < "3.5"
  Using cached typing-3.7.4.1-py2-none-any.whl (26 kB)
Processing /home/jenkins/.cache/pip/wheels/91/95/75/19c98a91239878abbc7c59970abd3b4e0438a7dd5b61778335/scandir-1.10.0-cp27-cp27mu-linux_x86_64.whl
Collecting configparser>=3.5; python_version < "3"
  Using cached configparser-4.0.2-py2.py3-none-any.whl (22 kB)
Installing collected packages: six, filelock, distlib, singledispatch, scandir, pathlib2, contextlib2, zipp, configparser, importlib-metadata, typing, importlib-resources, appdirs, virtualenv, pluggy, py, toml, tox, enum34, futures, grpcio, protobuf, grpcio-tools, future, mypy-protobuf
Successfully installed appdirs-1.4.3 configparser-4.0.2 contextlib2-0.6.0.post1 distlib-0.3.0 enum34-1.1.9 filelock-3.0.12 future-0.16.0 futures-3.3.0 grpcio-1.27.2 grpcio-tools-1.14.2 importlib-metadata-1.5.0 importlib-resources-1.2.0 mypy-protobuf-1.18 pathlib2-2.3.5 pluggy-0.13.1 protobuf-3.11.3 py-1.8.1 scandir-1.10.0 singledispatch-3.4.0.3 six-1.14.0 toml-0.10.0 tox-3.11.1 typing-3.7.4.1 virtualenv-20.0.8 zipp-1.2.0

> Task :sdks:python:sdist
setup.py:253: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/1922375555/local/lib/python2.7/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.21.0.dev' to '2.21.0.dev0'
  normalized_version,
INFO:gen_protos:Regenerating Python proto definitions (no output files).
INFO:gen_protos:Found protoc_gen_mypy at <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/1922375555/bin/protoc-gen-mypy>
beam_fn_api.proto: warning: Import google/protobuf/descriptor.proto but not used.
beam_fn_api.proto: warning: Import google/protobuf/wrappers.proto but not used.
beam_interactive_api.proto: warning: Import google/protobuf/timestamp.proto but not used.
Writing mypy to endpoints_pb2.pyi
Writing mypy to external_transforms_pb2.pyi
Writing mypy to beam_provision_api_pb2.pyi
Writing mypy to beam_runner_api_pb2.pyi
Writing mypy to standard_window_fns_pb2.pyi
Writing mypy to beam_artifact_api_pb2.pyi
Writing mypy to beam_fn_api_pb2.pyi
Writing mypy to metrics_pb2.pyi
Writing mypy to schema_pb2.pyi
Writing mypy to beam_job_api_pb2.pyi
Writing mypy to beam_interactive_api_pb2.pyi
Writing mypy to beam_expansion_api_pb2.pyi
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 12 times...]
RefactoringTool: Skipping optional fixer: idioms
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 23 times...]
RefactoringTool: Skipping optional fixer: ws_comma
root: Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
[...same line repeated 21 times...]
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>

> Task :sdks:java:harness:shadowJar

> Task :sdks:python:sdist
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: No changes to <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: Refactored <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
RefactoringTool: Files that were modified:
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_expansion_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_interactive_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_provision_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_grpc.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/endpoints_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/external_transforms_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/schema_pb2.py>
RefactoringTool: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/metrics_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_artifact_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/standard_window_fns_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_fn_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_job_api_pb2_urns.py>
INFO:gen_protos:Writing urn stubs: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/portability/api/beam_runner_api_pb2_urns.py>
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md


> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :runners:google-cloud-dataflow-java:worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:classes

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl (1.0 MB)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (1.27.2)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached mock-2.0.0-py2.py3-none-any.whl (56 kB)
Collecting numpy<2,>=1.14.3
  Using cached numpy-1.16.6-cp27-cp27mu-manylinux1_x86_64.whl (17.0 MB)
Collecting pymongo<4.0.0,>=3.8.0
  Using cached pymongo-3.10.1-cp27-cp27mu-manylinux1_x86_64.whl (444 kB)
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.11.3)
Collecting pydot<2,>=1.2.0
  Using cached pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting python-dateutil<3,>=2.8.0
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2019.3-py2.py3-none-any.whl (509 kB)
Processing /home/jenkins/.cache/pip/wheels/28/a0/fc/a3d4892b81eedc9e027323c08e9890bee1ec2d35edd1c1ac96/avro-1.9.2-py2-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached funcsigs-1.0.2-py2.py3-none-any.whl (17 kB)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl (17.5 MB)
Requirement already satisfied: typing<3.8.0,>=3.7.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.7.4.1)
Collecting typing-extensions<3.8.0,>=3.7.0
  Using cached typing_extensions-3.7.4.1-py2-none-any.whl (9.0 kB)
Collecting cachetools<4,>=3.1.0
  Using cached cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Collecting google-apitools<0.5.29,>=0.5.28
  Using cached google_apitools-0.5.28-py2-none-any.whl (134 kB)
Collecting google-cloud-datastore<1.8.0,>=1.7.1
  Using cached google_cloud_datastore-1.7.4-py2.py3-none-any.whl (82 kB)
Collecting google-cloud-pubsub<1.1.0,>=0.39.0
  Using cached google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-cloud-bigquery<=1.24.0,>=1.6.0
  Using cached google_cloud_bigquery-1.24.0-py2.py3-none-any.whl (165 kB)
Collecting google-cloud-core<2,>=0.28.1
  Using cached google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
  Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0
  Using cached google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting grpcio-gcp<1,>=0.2.2
  Using cached grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0
  Using cached google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language<2,>=1.3.0
  Using cached google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0
  Using cached google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Processing /home/jenkins/.cache/pip/wheels/09/61/a5/7e8f4442b3c3d406ee9eb6c06e1ecbe5625f62f8cb19c08f5b/googledatastore-7.0.2-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/bd/ce/33/8b769968db3761c42c7a91d8a0dbbafc50acfa0750866c8abd/proto_google_cloud_datastore_v1-0.90.4-cp27-none-any.whl
Collecting freezegun>=0.3.12
  Using cached freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting nose>=1.3.7
  Using cached nose-1.3.7-py2-none-any.whl (154 kB)
Processing /home/jenkins/.cache/pip/wheels/c4/1f/cd/9250fbf2fcc179e28bb4f7ee26a4fc7525914469d83a4f0c09/nose_xunitmp-0.4.1-cp27-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached pandas-0.24.2-cp27-cp27mu-manylinux1_x86_64.whl (10.1 MB)
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device


> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED
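
[Editor's note: build #50 hit the same [Errno 28] as build #51 above, this time inside installGcpTest, which points at accumulated state on the agent rather than a one-off. An illustrative helper for sizing the pip cache the Processing lines keep reading wheels from; the path is taken from the log, the helper itself is hypothetical:]

    # Illustrative cache-size helper; not part of the Beam build.
    import os

    def dir_size_gb(root):
        total = 0
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                total += os.path.getsize(os.path.join(dirpath, name))
        return total / 1e9

    print('pip cache: %.2f GB' % dir_size_gb('/home/jenkins/.cache/pip'))
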
> Task :runners:google-cloud-dataflow-java:worker:shadowJar

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 7s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/qrmfr6wfvtoxg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #49

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/49/display/redirect>

Changes:


------------------------------------------
[...truncated 5.46 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2ac5847f-70f0-4971-a4ce-102262998eba"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2ac5847f-70f0-4971-a4ce-102262998eba", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2ac5847f-70f0-4971-a4ce-102262998eba", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
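
[Editor's note: the job graph above is the serialized form of a three-step streaming pipeline: a Pub/Sub read (s1), the metrics ParDo (s2), and a Pub/Sub write (s3). A sketch of roughly how that graph is expressed with the Python SDK; the subscription and topic strings are placeholders, while the DoFn import path comes straight from the display_data:]

    # Illustrative reconstruction of the job graph; resource names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)

    options = PipelineOptions(streaming=True)  # JOB_TYPE_STREAMING in the graph above
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
               subscription='projects/<project>/subscriptions/<input-subscription>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub/Write' >> beam.io.WriteToPubSub(
               topic='projects/<project>/topics/<output-topic>'))
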
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T13:49:34.458288Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_05_49_32-8765789369096770966'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305134916-590925'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T13:49:34.458288Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_05_49_32-8765789369096770966]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_32-8765789369096770966?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_05_49_32-8765789369096770966 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:33Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:33Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_05_49_32-8765789369096770966. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:33Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_05_49_32-8765789369096770966.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:38.037Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:38.739Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.289Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.367Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.441Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.478Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.508Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.548Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.581Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.648Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.685Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.722Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.771Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.797Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.833Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:39.875Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:46.498Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:46.540Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:49:46.592Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:50:07.538Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T13:50:12.934Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_05_49_32-8765789369096770966 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
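
[Editor's note: the "Timing out on waiting for job ... after 60 seconds" warning in the captured log corresponds to a bounded wait on the pipeline result. A sketch of that pattern; the 60 000 ms value mirrors the log, and the cancel-on-timeout handling is an assumption about what a test harness might do:]

    # Bounded-wait sketch; assumes p is a beam.Pipeline built as in the example
    # above but without the with-block (the with form runs and waits implicitly).
    from apache_beam.runners.runner import PipelineState

    result = p.run()
    state = result.wait_until_finish(duration=60 * 1000)  # duration is in milliseconds
    if state not in (PipelineState.DONE, PipelineState.CANCELLED):
        result.cancel()  # don't leave a streaming test job running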

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2398.507s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_35-7026707032130560411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_00_18-2614820146354792455?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_32-8765789369096770966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_57_29-6883723233163475224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_06_18-6397059738431151592?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_35-3593952415065702316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_58_06-12666462611399105597?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_07_56-13888426797668252300?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_34-5840583468414571421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_59_09-4516435574888603171?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_08_27-8345001373471030128?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_33-6073208516665787463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_59_12-9084288894316967714?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_09_14-9740561827106134760?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_35-9824927616260375444?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_58_01-10091088635132882087?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_08_21-9844009507826052701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_34-15870485544539621883?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_58_26-10754493018744950276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_09_51-15344765778168138905?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_19_20-14724409510510393863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_49_35-18224108542149042221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_05_59_11-11189211800107940028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_06_08_51-2841216599859052316?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 12s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/kbp6yyshcd44s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #48

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/48/display/redirect?page=changes>

Changes:

[github] Add integration test for AnnotateVideoWithContext transform (#10986)


------------------------------------------
[...truncated 5.80 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input56771065-6e2c-4b60-aa0f-9f7153057713"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input56771065-6e2c-4b60-aa0f-9f7153057713", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output56771065-6e2c-4b60-aa0f-9f7153057713", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
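For orientation, the three steps in the job graph above map onto a straightforward Read -> ParDo -> Write pipeline: s1 is ReadFromPubSub/Read, s2 is the generate_metrics ParDo over StreamingUserMetricsDoFn, and s3 is dump_to_pub/Write/NativeWrite. A minimal sketch of a pipeline with that shape in the Beam Python SDK follows; the transform labels come from the user_name fields above, while the DoFn body and the <project>/<input-sub>/<output-topic> placeholders are illustrative assumptions (the actual test pipeline is apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline).

    # Sketch only: the DoFn body and resource placeholders are assumptions,
    # not the test's real code.
    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        def process(self, element):
            # User-defined metrics such as this counter are what the streaming
            # metrics test exercises; each unique metric name becomes one
            # Stackdriver metric descriptor (see the WARNING further down).
            Metrics.counter(self.__class__, 'total_bytes').inc(len(element))
            yield element

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input-sub>')  # s1
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())    # s2
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/<project>/topics/<output-topic>'))            # s3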
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T12:17:42.986416Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_04_17_41-5194898803500874049'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305121725-731248'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T12:17:42.986416Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_04_17_41-5194898803500874049]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_41-5194898803500874049?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_04_17_41-5194898803500874049 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:41.533Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:41.533Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_04_17_41-5194898803500874049.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:41.533Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_04_17_41-5194898803500874049. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:46.486Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:47.807Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:48.514Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:48.546Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:48.618Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:48.805Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:48.976Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.037Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.071Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.131Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.163Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.188Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.230Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.256Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.293Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:49.328Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:51.798Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:51.869Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:17:51.935Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:18:10.614Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:18:15.687Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:18:53.442Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T12:18:53.479Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_04_17_41-5194898803500874049 after 60 seconds
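This timeout warning is expected harness behavior rather than a job failure: the ValidatesRunner streaming tests submit the job and only block on it for a bounded time. In the Python SDK that bound is expressed via PipelineResult.wait_until_finish(duration=...), where duration is in milliseconds; a minimal sketch follows (the placeholder transform and the 60-second bound are assumptions inferred from the log line above).

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Placeholder pipeline; the real test submits the streaming-metrics job.
    p = beam.Pipeline(options=PipelineOptions(streaming=True))
    _ = p | beam.Create([b'probe'])

    result = p.run()
    # duration is in milliseconds; if the job has not reached a terminal state
    # when it expires, the runner logs the "Timing out on waiting ..." warning
    # and control returns to the caller.
    result.wait_until_finish(duration=60 * 1000)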
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2206.213s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_44-7253911600487458447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_27_03-15306751893065411432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_36_31-6435982963711470704?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_45_07-6274032832470182057?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_40-6767978401116000060?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_26_20-16083188239718393416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_19-17040372291699686387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_41-5194898803500874049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_25_09-444390690074889181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_15-13137881550704568015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_43-13701818600136396193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_25_59-8763507196137129918?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_52-8188940740492690215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_39-12043709366257286248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_26_11-16773616963084472045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_44-13165810392912116971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_43-12817133207700783949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_26_22-5826542975479925732?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_32-8457799961688787769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_44-8644685392182382043?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_26_53-11950570510995946909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_17_46-16088645353420497936?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_27_08-14364197267963467251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_04_35_56-16124209418648309214?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 8s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/j3oadfv77ks3i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #47

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/47/display/redirect?page=changes>

Changes:

[github] Update lostluck's info on the Go SDK roadmap

[github] Switch contact email to apache.org.


------------------------------------------
[...truncated 5.78 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputccbd4f5b-9ecb-4d1d-97a8-01f85991cc40"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputccbd4f5b-9ecb-4d1d-97a8-01f85991cc40", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputccbd4f5b-9ecb-4d1d-97a8-01f85991cc40", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T10:27:02.775253Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-05_02_27_00-8484980571011377973'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305102645-682616'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T10:27:02.775253Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-05_02_27_00-8484980571011377973]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_00-8484980571011377973?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-05_02_27_00-8484980571011377973 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:01.016Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:01.016Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-05_02_27_00-8484980571011377973.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:01.016Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-05_02_27_00-8484980571011377973. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:12.677Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:13.828Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.398Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.424Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.474Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.506Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.528Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.557Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.580Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.625Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.657Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.685Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.717Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.743Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.770Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:14.799Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:25.623Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:25.645Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:25.674Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:35.960Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T10:27:56.359Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-05_02_27_00-8484980571011377973 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2333.698s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_26_59-528409833864796460?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_35_58-11431146601121833004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_47_30-3844162878420208356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_57_09-12722124964221302810?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_26_59-12722418712667114221?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_35_58-9250942523363227979?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_46_30-16300550817172846977?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_00-8484980571011377973?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_35_23-16050668199162184334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_44_57-13794152189435241833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_01-14041831817807402743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_35_54-8050760470232563441?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_46_18-10387410150587506693?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_00-16335810226521878198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_35_23-12062962878129918501?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_44_16-18321103323354090568?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_01-7435157316476452568?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_36_45-1809763389596155797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_46_25-18269804759984510863?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_00-16197298561546632752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_37_36-9935627106159409368?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_27_00-6543836786199881991?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_37_06-14048056541629589586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-05_02_45_36-2288072184302009247?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 22m 2s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/bzqtxsrqroejk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #46

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/46/display/redirect?page=changes>

Changes:

[github] [BEAM-8328] Disable community metrics integration test in 'test' task


------------------------------------------
[...truncated 52.86 KB...]

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
DEPRECATION: Python 2.7 reached the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 is no longer maintained. A future version of pip will drop support for Python 2.7. More details about Python 2 support in pip, can be found at https://pip.pypa.io/en/latest/development/release-process/#python-2-support
Processing <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz>
Processing /home/jenkins/.cache/pip/wheels/50/24/4d/4580ca4a299f1ad6fd63443e6e584cb21e9a07988e4aa8daac/crcmod-1.7-cp27-cp27mu-linux_x86_64.whl
Processing /home/jenkins/.cache/pip/wheels/59/b1/91/f02e76c732915c4015ab4010f3015469866c1eb9b14058d8e7/dill-0.3.1.1-cp27-none-any.whl
Collecting fastavro<0.22,>=0.21.4
  Using cached fastavro-0.21.24-cp27-cp27mu-manylinux1_x86_64.whl (1.0 MB)
Requirement already satisfied: future<1.0.0,>=0.16.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (0.16.0)
Requirement already satisfied: grpcio<2,>=1.12.1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (1.27.2)
Processing /home/jenkins/.cache/pip/wheels/fe/a7/05/23e3699975fc20f8a30e00ac1e515ab8c61168e982abe4ce70/hdfs-2.5.8-cp27-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/6d/41/4b/2b369d6e2b7eaebcdd423516d3fb659c7658c16a2be8fd04ec/httplib2-0.12.0-cp27-none-any.whl
Collecting mock<3.0.0,>=1.0.1
  Using cached mock-2.0.0-py2.py3-none-any.whl (56 kB)
Collecting numpy<2,>=1.14.3
  Using cached numpy-1.16.6-cp27-cp27mu-manylinux1_x86_64.whl (17.0 MB)
Collecting pymongo<4.0.0,>=3.8.0
  Using cached pymongo-3.10.1-cp27-cp27mu-manylinux1_x86_64.whl (444 kB)
Processing /home/jenkins/.cache/pip/wheels/48/f7/87/b932f09c6335dbcf45d916937105a372ab14f353a9ca431d7d/oauth2client-3.0.0-cp27-none-any.whl
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.11.3)
Collecting pydot<2,>=1.2.0
  Using cached pydot-1.4.1-py2.py3-none-any.whl (19 kB)
Collecting python-dateutil<3,>=2.8.0
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2019.3-py2.py3-none-any.whl (509 kB)
Processing /home/jenkins/.cache/pip/wheels/28/a0/fc/a3d4892b81eedc9e027323c08e9890bee1ec2d35edd1c1ac96/avro-1.9.2-py2-none-any.whl
Collecting funcsigs<2,>=1.0.2
  Using cached funcsigs-1.0.2-py2.py3-none-any.whl (17 kB)
Requirement already satisfied: futures<4.0.0,>=3.2.0 in <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages> (from apache-beam==2.21.0.dev0) (3.3.0)
Processing /home/jenkins/.cache/pip/wheels/81/91/41/3272543c0b9c61da9c525f24ee35bae6fe8f60d4858c66805d/PyVCF-0.6.8-cp27-none-any.whl
Collecting pyarrow<0.16.0,>=0.15.1
  Using cached pyarrow-0.15.1-cp27-cp27mu-manylinux2010_x86_64.whl (17.5 MB)
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device
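This ENOSPC failure is environmental: the Jenkins agent's disk filled up mid-install, and the same condition surfaces again below as the java.io.IOException in the :runners:google-cloud-dataflow-java:worker:shadowJar task. A pre-flight check along the following lines (the path and threshold are illustrative assumptions, not something the Beam build actually does) would fail fast instead:

    # Sketch: abort early when the agent's disk is nearly full ([Errno 28]).
    import os

    def free_gib(path):
        st = os.statvfs(path)
        return st.f_bavail * st.f_frsize / float(1 << 30)

    # /home/jenkins and the 5 GiB floor are assumptions about the agent layout.
    if free_gib('/home/jenkins') < 5.0:
        raise SystemExit('Refusing to build: less than 5 GiB free on agent')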


> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED

> Task :runners:google-cloud-dataflow-java:worker:shadowJar
java.io.IOException: No space left on device
	at java.io.RandomAccessFile.writeBytes(Native Method)
	at java.io.RandomAccessFile.write(RandomAccessFile.java:525)
	at shadow.org.apache.tools.zip.ZipOutputStream.writeOut(ZipOutputStream.java:1472)
	at shadow.org.apache.tools.zip.ZipOutputStream.writeCounted(ZipOutputStream.java:898)
	at shadow.org.apache.tools.zip.ZipOutputStream.deflate(ZipOutputStream.java:1003)
	at shadow.org.apache.tools.zip.ZipOutputStream.deflateUntilInputIsNeeded(ZipOutputStream.java:1493)
	at shadow.org.apache.tools.zip.ZipOutputStream.writeDeflated(ZipOutputStream.java:911)
	at shadow.org.apache.tools.zip.ZipOutputStream.write(ZipOutputStream.java:881)
	at shadow.org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:2315)
	at shadow.org.apache.commons.io.IOUtils.copy(IOUtils.java:2270)
	at shadow.org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:2291)
	at shadow.org.apache.commons.io.IOUtils$copyLarge.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.copyArchiveEntry(ShadowCopyAction.groovy:382)
	at sun.reflect.GeneratedMethodAccessor451.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:176)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.visitArchiveFile(ShadowCopyAction.groovy:286)
	at sun.reflect.GeneratedMethodAccessor448.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:326)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:352)
	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:68)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:176)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction$_processArchive_closure3.doCall(ShadowCopyAction.groovy:265)
	at sun.reflect.GeneratedMethodAccessor438.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:326)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:264)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
	at groovy.lang.Closure.call(Closure.java:411)
	at groovy.lang.Closure.call(Closure.java:427)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2296)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2281)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2322)
	at org.codehaus.groovy.runtime.dgm$186.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:246)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:55)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.processArchive(ShadowCopyAction.groovy:263)
	at sun.reflect.GeneratedMethodAccessor487.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.visitFile(ShadowCopyAction.groovy:248)
	at sun.reflect.GeneratedMethodAccessor399.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$BaseStreamAction.processFile(ShadowCopyAction.groovy:183)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1$1.processFile(NormalizingCopyActionDecorator.java:66)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1$1.processFile(DuplicateHandlingCopyActionDecorator.java:60)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.processFile(CopyFileVisitorImpl.java:62)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.visitFile(CopyFileVisitorImpl.java:46)
	at org.gradle.api.internal.file.collections.AbstractSingletonFileTree.visit(AbstractSingletonFileTree.java:36)
	at org.gradle.api.internal.file.collections.FileTreeAdapter.visit(FileTreeAdapter.java:118)
	at org.gradle.api.internal.file.CompositeFileTree.visit(CompositeFileTree.java:93)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:39)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:24)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:693)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:695)
	at org.gradle.api.internal.file.copy.DefaultCopySpec.walk(DefaultCopySpec.java:499)
	at org.gradle.api.internal.file.copy.CopySpecBackedCopyActionProcessingStream.process(CopySpecBackedCopyActionProcessingStream.java:38)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1.process(DuplicateHandlingCopyActionDecorator.java:44)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1.process(NormalizingCopyActionDecorator.java:57)
	at org.gradle.api.internal.file.copy.CopyActionProcessingStream$process.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2.execute(ShadowCopyAction.groovy:110)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2$execute.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.withResource(ShadowCopyAction.groovy:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite$StaticMetaMethodSiteNoUnwrapNoCoerce.invoke(StaticMetaMethodSite.java:151)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite.callStatic(StaticMetaMethodSite.java:102)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:216)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.execute(ShadowCopyAction.groovy:107)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator.execute(NormalizingCopyActionDecorator.java:53)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator.execute(DuplicateHandlingCopyActionDecorator.java:42)
	at org.gradle.api.internal.file.copy.CopyActionExecuter.execute(CopyActionExecuter.java:40)
	at org.gradle.api.tasks.AbstractCopyTask.copy(AbstractCopyTask.java:179)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar.copy(ShadowJar.java:96)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$1(SkipUpToDateStep.java:90)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:90)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 58s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/jzdefjv7pn4am

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #45

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/45/display/redirect>

Changes:


------------------------------------------
[...truncated 164.29 KB...]
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/cy_combiners.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/core.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/window_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/create_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/cy_dataflow_distribution_counter.pyx -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/environments_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/util.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/timeutil.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/create_source.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/sideinputs.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/trigger_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/external_test_py37.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/environments.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/ptransform.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/cy_dataflow_distribution_counter.pxd -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/stats_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/validate_runner_xlang_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/__init__.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/userstate.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/combiners_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/display_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/display.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/window.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/write_ptransform_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/external_java.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/external_it_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/external_test_py3.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/transforms_keyword_only_args_test_py3.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  copying build/lib.linux-x86_64-2.7/apache_beam/transforms/external.py -> build/bdist.linux-x86_64/wheel/apache_beam/transforms
  creating build/bdist.linux-x86_64/wheel/apache_beam/options
  copying build/lib.linux-x86_64-2.7/apache_beam/options/value_provider_test.py -> build/bdist.linux-x86_64/wheel/apache_beam/options
  copying build/lib.linux-x86_64-2.7/apache_beam/options/pipeline_options.py -> build/bdist.linux-x86_64/wheel/apache_beam/options
  error: [Errno 28] No space left on device
  ----------------------------------------
  ERROR: Failed building wheel for apache-beam
Failed to build apache-beam
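
The two "No space left on device" failures in this build (the wheel build above and the shadowJar task below) both trace back to the Jenkins worker's disk filling up mid-build. As a minimal sketch, a pre-flight guard along these lines could fail fast with a clearer message; the helper name, the /tmp path, and the 5 GiB threshold are all assumptions, not part of the Beam build:

    import os
    import sys

    def require_free_space(path='/tmp', min_free_gib=5):
        """Abort early if the filesystem holding `path` is nearly full.

        Uses os.statvfs, so this sketch is POSIX-only (fine for the
        Linux Jenkins workers these logs come from).
        """
        st = os.statvfs(path)
        free_gib = st.f_bavail * st.f_frsize / float(1024 ** 3)
        if free_gib < min_free_gib:
            sys.exit('only %.1f GiB free on %s; need at least %d GiB'
                     % (free_gib, path, min_free_gib))

    require_free_space()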

> Task :runners:google-cloud-dataflow-java:worker:shadowJar
java.io.IOException: No space left on device
	at java.io.RandomAccessFile.writeBytes(Native Method)
	at java.io.RandomAccessFile.write(RandomAccessFile.java:525)
	at shadow.org.apache.tools.zip.ZipOutputStream.writeOut(ZipOutputStream.java:1472)
	at shadow.org.apache.tools.zip.ZipOutputStream.writeCounted(ZipOutputStream.java:898)
	at shadow.org.apache.tools.zip.ZipOutputStream.deflate(ZipOutputStream.java:1003)
	at shadow.org.apache.tools.zip.ZipOutputStream.flushDeflater(ZipOutputStream.java:578)
	at shadow.org.apache.tools.zip.ZipOutputStream.closeEntry(ZipOutputStream.java:535)
	at shadow.org.apache.tools.zip.ZipOutputStream$closeEntry$3.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.copyArchiveEntry(ShadowCopyAction.groovy:386)
	at sun.reflect.GeneratedMethodAccessor461.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:176)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.visitArchiveFile(ShadowCopyAction.groovy:286)
	at sun.reflect.GeneratedMethodAccessor458.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:326)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:352)
	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.callCurrent(PogoMetaClassSite.java:68)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:176)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction$_processArchive_closure3.doCall(ShadowCopyAction.groovy:265)
	at sun.reflect.GeneratedMethodAccessor448.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:326)
	at org.codehaus.groovy.runtime.metaclass.ClosureMetaClass.invokeMethod(ClosureMetaClass.java:264)
	at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1041)
	at groovy.lang.Closure.call(Closure.java:411)
	at groovy.lang.Closure.call(Closure.java:427)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2296)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2281)
	at org.codehaus.groovy.runtime.DefaultGroovyMethods.each(DefaultGroovyMethods.java:2322)
	at org.codehaus.groovy.runtime.dgm$186.invoke(Unknown Source)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:246)
	at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:55)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:127)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.processArchive(ShadowCopyAction.groovy:263)
	at sun.reflect.GeneratedMethodAccessor497.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$StreamAction.visitFile(ShadowCopyAction.groovy:248)
	at sun.reflect.GeneratedMethodAccessor409.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:190)
	at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:58)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:168)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$BaseStreamAction.processFile(ShadowCopyAction.groovy:183)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1$1.processFile(NormalizingCopyActionDecorator.java:66)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1$1.processFile(DuplicateHandlingCopyActionDecorator.java:60)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.processFile(CopyFileVisitorImpl.java:62)
	at org.gradle.api.internal.file.copy.CopyFileVisitorImpl.visitFile(CopyFileVisitorImpl.java:46)
	at org.gradle.api.internal.file.collections.AbstractSingletonFileTree.visit(AbstractSingletonFileTree.java:36)
	at org.gradle.api.internal.file.collections.FileTreeAdapter.visit(FileTreeAdapter.java:118)
	at org.gradle.api.internal.file.CompositeFileTree.visit(CompositeFileTree.java:93)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:39)
	at org.gradle.api.internal.file.copy.CopySpecActionImpl.execute(CopySpecActionImpl.java:24)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:693)
	at org.gradle.api.internal.file.copy.DefaultCopySpec$DefaultCopySpecResolver.walk(DefaultCopySpec.java:695)
	at org.gradle.api.internal.file.copy.DefaultCopySpec.walk(DefaultCopySpec.java:499)
	at org.gradle.api.internal.file.copy.CopySpecBackedCopyActionProcessingStream.process(CopySpecBackedCopyActionProcessingStream.java:38)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator$1.process(DuplicateHandlingCopyActionDecorator.java:44)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator$1.process(NormalizingCopyActionDecorator.java:57)
	at org.gradle.api.internal.file.copy.CopyActionProcessingStream$process.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2.execute(ShadowCopyAction.groovy:110)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction$2$execute.call(Unknown Source)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.withResource(ShadowCopyAction.groovy:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:104)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite$StaticMetaMethodSiteNoUnwrapNoCoerce.invoke(StaticMetaMethodSite.java:151)
	at org.codehaus.groovy.runtime.callsite.StaticMetaMethodSite.callStatic(StaticMetaMethodSite.java:102)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:216)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowCopyAction.execute(ShadowCopyAction.groovy:107)
	at org.gradle.api.internal.file.copy.NormalizingCopyActionDecorator.execute(NormalizingCopyActionDecorator.java:53)
	at org.gradle.api.internal.file.copy.DuplicateHandlingCopyActionDecorator.execute(DuplicateHandlingCopyActionDecorator.java:42)
	at org.gradle.api.internal.file.copy.CopyActionExecuter.execute(CopyActionExecuter.java:40)
	at org.gradle.api.tasks.AbstractCopyTask.copy(AbstractCopyTask.java:179)
	at com.github.jengelman.gradle.plugins.shadow.tasks.ShadowJar.copy(ShadowJar.java:96)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
	at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
	at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
	at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
	at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
	at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
	at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
	at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
	at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
	at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
	at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
	at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
	at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
	at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$1(SkipUpToDateStep.java:90)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:90)
	at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
	at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
	at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
	at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
	at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
	at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
	at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
	at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
	at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
	at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
	at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
	at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
	at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
	at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
	at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
	at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
	at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
	at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
	at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
	at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
	at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
	at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
	at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
	at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
	at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
	at java.lang.Thread.run(Thread.java:748)

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest FAILED
Installing collected packages: crcmod, dill, fastavro, docopt, certifi, idna, chardet, urllib3, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro, pyvcf, pyarrow, typing-extensions, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, google-cloud-dlp, google-cloud-language, google-cloud-videointelligence, google-cloud-vision, proto-google-cloud-datastore-v1, googledatastore, freezegun, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, atomicwrites, packaging, attrs, wcwidth, more-itertools, pytest, pytest-forked, apipkg, execnet, pytest-xdist, pytest-timeout, apache-beam
ERROR: Could not install packages due to an EnvironmentError: [Errno 28] No space left on device


FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 7s
62 actionable tasks: 44 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/xo5jeyhx7eaha

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #44

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/44/display/redirect?page=changes>

Changes:

[filiperegadas] Add BigQuery useAvroLogicalTypes option

[filiperegadas] fixup! Add BigQuery useAvroLogicalTypes option


------------------------------------------
[...truncated 5.86 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputa67fb4e6-c2f6-4d4a-b6b0-db83ba64034e"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputa67fb4e6-c2f6-4d4a-b6b0-db83ba64034e", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputa67fb4e6-c2f6-4d4a-b6b0-db83ba64034e", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
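
For orientation, here is a minimal sketch (not the actual test code) of the pipeline shape this job graph describes. The step labels and the DoFn name are taken from the graph itself; the project, subscription, and topic values are placeholders:

    import apache_beam as beam
    from apache_beam.metrics.metric import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Hypothetical stand-in for the DoFn named in the graph: it
        updates a user counter per element and passes the element on."""
        def process(self, element):
            Metrics.counter(self.__class__, 'total_bytes_count').inc(len(element))
            yield element

    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        _ = (pipeline
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input-sub>')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/<project>/topics/<output-topic>'))
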
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T06:09:30.883656Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_22_09_29-15008864655534267235'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305060910-294958'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T06:09:30.883656Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_22_09_29-15008864655534267235]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_29-15008864655534267235?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_22_09_29-15008864655534267235 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:29.404Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_22_09_29-15008864655534267235. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:29.404Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:29.404Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_22_09_29-15008864655534267235.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:33.962Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.204Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.738Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.777Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.845Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.889Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.920Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.956Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:35.988Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.035Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.063Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.098Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.140Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.170Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.209Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:36.245Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:09:46.639Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:10:06.104Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:10:06.176Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:10:06.214Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T06:10:29.955Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_22_09_29-15008864655534267235 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2145.564s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_28-6235095680323314223?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_18_28-5816072143313838576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_28_05-1162210595765973150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_36_41-13710395477161125185?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_29-15008864655534267235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_17_56-18420701984220122875?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_26_48-5693441324974103933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_30-11517657047387259717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_18_26-15326904737208437696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_27_43-14722190672063844781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_28-17636901866625362292?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_17_59-6198697534041453715?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_26_40-16185043602908584485?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_25-6117733040743482402?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_17_52-6579685829061191743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_26_48-14652462365908659781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_32-16602994582045690363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_18_00-9479736159451361724?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_27_24-11600281722126292446?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_31-14975624188688654772?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_18_27-11625201814306167636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_09_27-3740183360861470985?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_17_30-10143570653864952873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_22_26_20-5474885826200421738?project=apache-beam-testing
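
Aside from the test failure itself, the JOB_MESSAGE_WARNING in the captured logging above notes that the project already holds 100 Dataflow-created metric descriptors, so Stackdriver stops creating new custom metrics for these jobs. A hedged sketch of the cleanup it suggests, using the Cloud Monitoring client (signatures follow the v1-era google-cloud-monitoring library and vary across releases; the filter is illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    # Illustrative filter: match Dataflow-created custom metric descriptors.
    descriptor_filter = (
        'metric.type = starts_with("custom.googleapis.com/dataflow")')
    for descriptor in client.list_metric_descriptors(
            project_name, filter_=descriptor_filter):
        # delete_metric_descriptor takes the descriptor's full resource name.
        client.delete_metric_descriptor(descriptor.name)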

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 30s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/vabsytts53wvs

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



beam_PostCommit_Py_VR_Dataflow_V2 - Build # 43 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Py_VR_Dataflow_V2 (build #43)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/43/ to view the results.

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #42

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/42/display/redirect?page=changes>

Changes:

[jvilcek] [BEAM-9360] Fix equivalence check for FieldType


------------------------------------------
[...truncated 5.73 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input50309d4d-b0f6-4b9c-b447-d6098712d223"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input50309d4d-b0f6-4b9c-b447-d6098712d223", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output50309d4d-b0f6-4b9c-b447-d6098712d223", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T02:19:17.815545Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_18_19_16-15245954841552869018'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305021833-127035'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T02:19:17.815545Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_18_19_16-15245954841552869018]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_19_16-15245954841552869018?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_18_19_16-15245954841552869018 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:16.152Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_18_19_16-15245954841552869018. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:16.152Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:16.152Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_18_19_16-15245954841552869018.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:19.890Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.036Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.634Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.672Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.728Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.762Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.796Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.819Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.857Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.920Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.953Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:21.996Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:22.034Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:22.066Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:22.105Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:22.140Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:32.352Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:32.389Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:32.428Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:46.490Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T02:19:57.117Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_18_19_16-15245954841552869018 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2198.720s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_55-13311535604581420161?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_28_23-8495430847586401115?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_36_47-12312358495399042784?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_45_27-4228147902906688195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_55-16437180039778689522?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_27_12-16375062967467693156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_35_40-14781424366962449206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_19_16-15245954841552869018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_26_38-6286350138017533579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_35_05-16535498769695471506?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_57-18198531033542900894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_27_33-7908823010542893855?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_53-3079974271806000745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_26_57-3770029001847355728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_34_18-4539537627356271508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_53-8781826092858913222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_27_03-1728176694908519303?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_35_56-7085840235770842586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_56-13909474357685325742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_27_37-18186028266915586779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_35_54-528039054410295741?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_18_52-16348638089271675939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_27_54-9982397356700697234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_18_36_23-8593112864698159181?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 12m 4s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/aqo73pofdkca2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #41

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/41/display/redirect?page=changes>

Changes:

[boyuanz] Update verify_release_build script to run python tests with dev version.


------------------------------------------
[...truncated 5.76 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input3c1ee4ce-26ed-4d4f-874b-9b1ea2122ca9"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input3c1ee4ce-26ed-4d4f-874b-9b1ea2122ca9", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output3c1ee4ce-26ed-4d4f-874b-9b1ea2122ca9", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-05T00:59:10.115582Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_16_59_08-13898668495802438395'
 location: u'us-central1'
 name: u'beamapp-jenkins-0305005852-427494'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-05T00:59:10.115582Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_16_59_08-13898668495802438395]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_08-13898668495802438395?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_16_59_08-13898668495802438395 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:08.836Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:08.836Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_16_59_08-13898668495802438395. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:08.836Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_16_59_08-13898668495802438395.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:13.080Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:14.350Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:14.955Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:14.983Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.063Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.096Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.124Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.145Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.169Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.212Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.232Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.256Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.288Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.313Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.344Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:15.372Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:52.238Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:55.573Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:55.605Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T00:59:55.638Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-05T01:00:19.740Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_16_59_08-13898668495802438395 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
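
The job graph above (ParallelRead s1 -> ParallelDo s2 -> ParallelWrite s3) corresponds to a three-step streaming pipeline. A minimal sketch of that shape in the Python SDK, using the transform labels from the "user_name" fields; the subscription and topic placeholders are illustrative, not the generated test resources:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)

    options = PipelineOptions(streaming=True)  # the job type is JOB_TYPE_STREAMING
    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-subscription>')
         | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
         | 'dump_to_pub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output-topic>'))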

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2044.672s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_06-15094065714640621944?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_58-15122830693114908534?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_16_42-4615732854987095263?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_25_02-6252487527232887505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_05-8001856396896492551?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_36-16564411523802170710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_16_36-6242603162854919965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_08-13898668495802438395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_01-13020517408203231471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_14_41-11101911548427970767?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_07-14548121626453335366?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_30-14348335837094098831?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_15_50-12706643526831725073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_07-8570344115647390948?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_43-912845718828141008?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_06-16867778906419224001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_19-2883422224321053795?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_15_33-553982504030810136?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_07-16514167015682648743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_33-9576346054887912100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_15_37-15566493216541390078?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_16_59_08-8390943185722999235?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_07_20-7012663258065688950?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_17_15_41-1044627153222421382?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 8m 37s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/iwjjy5upli5uo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #40

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/40/display/redirect?page=changes>

Changes:

[ehudm] [BEAM-3713] Remove nosetests from tox.ini

[github] [BEAM-6374] Emit PCollection metrics from GoSDK (#10942)


------------------------------------------
[...truncated 5.81 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf0786f86-a937-4340-9d04-df0c352de2c0"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf0786f86-a937-4340-9d04-df0c352de2c0", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputf0786f86-a937-4340-9d04-df0c352de2c0", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T21:55:42.972162Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_13_55_41-1468646433356882277'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304215526-887906'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T21:55:42.972162Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_13_55_41-1468646433356882277]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_41-1468646433356882277?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_13_55_41-1468646433356882277 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:41.469Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_13_55_41-1468646433356882277.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:41.469Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:41.469Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_13_55_41-1468646433356882277. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:45.052Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:46.357Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:46.915Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:46.945Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.013Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.044Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.070Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.097Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.120Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.171Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.203Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.233Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.267Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.291Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.310Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:47.332Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:49.549Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:49.574Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:55:49.602Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:56:03.089Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T21:56:13.911Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_13_55_41-1468646433356882277 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
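
The JOB_MESSAGE_WARNING above about 100 Dataflow-created metric descriptors refers to the Stackdriver per-project limit; stale descriptors can be listed and deleted through the Cloud Monitoring API. A sketch with the google-cloud-monitoring client, assuming the pre-2.0 call signatures current at the time of these builds; the filter string is an assumption about how Dataflow names its custom metrics:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # Assumed filter: Dataflow publishes user metrics under custom.googleapis.com.
    descriptors = client.list_metric_descriptors(
        project_name,
        filter_='metric.type = starts_with("custom.googleapis.com/dataflow")')
    for descriptor in descriptors:
        # Deletion is irreversible; verify the listing before deleting
        # anything in a shared project.
        client.delete_metric_descriptor(descriptor.name)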

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2168.049s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_40-2221377313429219733?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_05_15-7931838960717832257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_13_26-11861064354801985739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_38-2041250515606479576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_03_45-16422842363975048401?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_13_58-13247366263375350012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_22_39-3431525389626067921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_43-3397166799912103439?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_04_01-13919720866722533200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_13_56-15325740025518898984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_39-6010653581097941699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_04_46-5152671941827698250?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_12_56-6143133383422115557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_41-1468646433356882277?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_02_46-6408930337922555874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_12_07-14483766687076265369?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_41-6793693781631284876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_04_31-1716967519985541106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_12_45-11184667822780633510?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_42-16164122202733339465?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_05_20-3834285809514292644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_13_55_39-15931004275868960961?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_04_45-8796617021426516152?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_14_12_59-3847420502729184013?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 42s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/3nkxlolnmjt5u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #39

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/39/display/redirect?page=changes>

Changes:

[github] Merge pull request #11025: [BEAM-6428] Improve select performance with


------------------------------------------
[...truncated 5.73 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input77f79baf-f855-4f1e-8ec3-8ba04727923e"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input77f79baf-f855-4f1e-8ec3-8ba04727923e", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output77f79baf-f855-4f1e-8ec3-8ba04727923e", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T20:28:25.526502Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_12_28_23-6962712671740759323'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304202804-575085'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T20:28:25.526502Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_12_28_23-6962712671740759323]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_23-6962712671740759323?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_12_28_23-6962712671740759323 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:23.367Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:23.367Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_12_28_23-6962712671740759323.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:23.367Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_12_28_23-6962712671740759323. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:35.009Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:36.170Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:36.937Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:36.976Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.062Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.309Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.478Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.540Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.585Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.654Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.701Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.736Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.783Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.818Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.857Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:37.897Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:48.019Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:48.175Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:28:48.248Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:29:01.321Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T20:29:14.714Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_12_28_23-6962712671740759323 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2154.530s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_24-3115408544130947752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_38_14-7299854674064938225?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_46_57-12157013286800453035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_55_34-8638929658734679924?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_24-730983118713179980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_37_15-3886201878312176674?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_46_08-14070376149321898742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_23-6962712671740759323?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_35_57-10828642263790674611?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_44_00-15899262372259612659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_25-12172041390804335322?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_38_12-776894692372594597?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_46_48-5941366743487772791?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_20-9999889087741367125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_36_39-11115165173772867483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_45_17-6880738429152446664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_25-8447581671553382556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_37_33-17243871068887909099?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_21-6205186022365713800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_36_33-4988719722498109438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_45_05-5607213397217763898?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_28_21-16123429381612312424?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_36_55-7384660512860408041?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_12_45_23-6631180185088100726?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 10m 2s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ft62tzvodsype

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #38

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/38/display/redirect?page=changes>

Changes:

[rohde.samuel] ReverseTestStream Implementation


------------------------------------------
[...truncated 5.73 MB...]
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd52c76f1-0b4c-463d-bfc7-5e67297d11e4"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputd52c76f1-0b4c-463d-bfc7-5e67297d11e4", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputd52c76f1-0b4c-463d-bfc7-5e67297d11e4", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
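
For reference, the truncated job description above encodes a three-step streaming pipeline: a Pub/Sub read (s1), a metrics-generating ParDo (s2), and a Pub/Sub write (s3). A minimal Beam Python sketch of that shape follows; the DoFn body, the metric name, and the <project>/<subscription>/<topic> placeholders are illustrative assumptions, not the actual code of dataflow_exercise_streaming_metrics_pipeline.

    # Minimal sketch of the job graph above:
    # ReadFromPubSub -> generate_metrics -> dump_to_pub.
    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions


    class StreamingUserMetricsDoFn(beam.DoFn):
        """Bumps a user counter for every element, then passes it through."""

        def process(self, element):
            # 'example_counter' is a hypothetical metric name.
            Metrics.counter(self.__class__, 'example_counter').inc()
            yield element


    def run(argv=None):
        options = PipelineOptions(argv)
        options.view_as(StandardOptions).streaming = True
        with beam.Pipeline(options=options) as p:
            (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                   subscription='projects/<project>/subscriptions/<input-subscription>')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                   'projects/<project>/topics/<output-topic>'))
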
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T19:17:45.311868Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_11_17_44-8469788470556049449'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304191727-987712'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T19:17:45.311868Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_11_17_44-8469788470556049449]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_44-8469788470556049449?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_11_17_44-8469788470556049449 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:44.204Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_11_17_44-8469788470556049449. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:44.204Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:44.204Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_11_17_44-8469788470556049449.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:54.440Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:57.764Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.341Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.375Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.455Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.511Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.552Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.588Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.626Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.696Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.727Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.763Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.815Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.851Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.899Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:17:58.943Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:18:26.823Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:18:34.361Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:18:34.402Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T19:18:34.445Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_11_17_44-8469788470556049449 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
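
The "Timing out on waiting for job ... after 60 seconds" WARNING in the captured log above comes from the harness giving the streaming job a bounded wait before it queries and tears the job down. A minimal sketch of that pattern (the 60-second budget matches the log; the explicit cancel is an assumption about harness behavior):

    import logging

    result = pipeline.run()  # pipeline as sketched earlier
    # wait_until_finish takes a duration in milliseconds and returns the
    # terminal state, or None if the job is still running when time runs out.
    state = result.wait_until_finish(duration=60 * 1000)
    if state is None:
        logging.warning('Timing out on waiting for the job after 60 seconds')
        # Streaming jobs never finish on their own, so tests cancel explicitly.
        result.cancel()
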

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2150.483s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_41-13344856306362701356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_27_14-12537101361671919141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_35_45-18259155361061453164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_44_27-3934942960756395114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_40-6579625613637112264?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_26_06-2766416176500045417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_34_17-10574663050835801424?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_44-8469788470556049449?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_26_07-4841241048211798940?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_43-3269562621612846922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_25_54-14379426588275183986?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_34_01-16651866241787259244?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_39-3058463194971338427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_25_57-6878891632148866506?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_34_37-3888823284833058266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_41-11910513570429050532?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_26_03-15893689291980235168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_34_36-1835711416915188021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_42-1652544172187538664?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_26_35-10105232523362714965?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_34_20-4195450019415040541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_17_44-15132484439154104056?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_26_14-17404673452179868452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_11_35_34-3482330855577023727?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 56s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/3hu6mvxir3tr2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #37

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/37/display/redirect?page=changes>

Changes:

[github] [BEAM-9432] Move expansion service into its own project. (#11035)


------------------------------------------
[...truncated 5.77 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input9f21fcfc-1c99-47a7-a216-1f42d9439fca"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input9f21fcfc-1c99-47a7-a216-1f42d9439fca", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output9f21fcfc-1c99-47a7-a216-1f42d9439fca", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T17:57:09.253842Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_09_57_07-8810999401771546541'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304175652-657262'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T17:57:09.253842Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_09_57_07-8810999401771546541]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_07-8810999401771546541?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_09_57_07-8810999401771546541 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:07.549Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:07.549Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_09_57_07-8810999401771546541. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:07.549Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_09_57_07-8810999401771546541.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:12.010Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:13.327Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:13.877Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:13.909Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:13.981Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.025Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.056Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.191Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.218Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.270Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.299Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.338Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.380Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.413Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.446Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:14.473Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:40.578Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:40.613Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:40.657Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:57:42.945Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T17:58:10.395Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_09_57_07-8810999401771546541 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
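
Once the job is cancelled, a ValidatesRunner metrics test typically asserts on the user counters it emitted by querying the PipelineResult. A sketch, assuming the same hypothetical counter name as the pipeline sketch earlier ('example_counter'; the real names live in dataflow_exercise_streaming_metrics_pipeline):

    from apache_beam.metrics.metric import MetricsFilter

    # 'result' is the PipelineResult returned by pipeline.run().
    query = result.metrics().query(MetricsFilter().with_name('example_counter'))
    for counter in query['counters']:
        # Streaming Dataflow reports both attempted and committed values.
        print(counter.key.metric.name, counter.attempted, counter.committed)
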

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2206.084s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_06-8188875066590386093?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_07_07-8978856786625677508?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_15_23-1243251042973909925?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_24_47-1563822586799043698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_07-8810999401771546541?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_04_52-6980300222297820187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_13_44-18065917769831069655?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_08-7943364749590930337?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_06_22-10315503374080496634?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_07-16233839503978995237?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_05_45-6381422530247851306?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_14_06-9158644860500425777?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_06-13036979454196899633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_05_25-4209773800487484507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_13_27-3190049915693859096?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_07-1106467718903153428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_06_23-7857595504187879701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_15_24-4554809791428033642?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_10-13286840042815899095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_05_54-11566098351012337165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_15_11-15094281497382405656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_09_57_07-17045598861209775691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_05_24-16383645711634171897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_10_13_25-15481452028156212942?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 19s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/7xeemn54ifhko

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #36

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/36/display/redirect>

Changes:


------------------------------------------
[...truncated 5.86 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputa7635acc-5077-4571-b47a-ba3cc83e5b76"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputa7635acc-5077-4571-b47a-ba3cc83e5b76", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputa7635acc-5077-4571-b47a-ba3cc83e5b76", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T12:39:49.127744Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-04_04_39_47-6391020433624240418'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304123921-281948'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T12:39:49.127744Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-04_04_39_47-6391020433624240418]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_47-6391020433624240418?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-04_04_39_47-6391020433624240418 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:47.869Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-04_04_39_47-6391020433624240418. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:47.869Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:47.869Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-04_04_39_47-6391020433624240418.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:52.716Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:53.980Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.661Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.685Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.779Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.825Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.855Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.882Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.908Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.967Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:54.999Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:55.031Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:55.074Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:55.110Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:55.138Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:39:55.167Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:40:00.931Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:40:00.959Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:40:00.992Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:40:21.331Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T12:40:24.121Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-04_04_39_47-6391020433624240418 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
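
The JOB_MESSAGE_WARNING about 100 Dataflow-created metric descriptors recurs in every run above; per the message, stale descriptors have to be deleted through the Cloud Monitoring API. A rough sketch with the google-cloud-monitoring client (the client surface varies by library version, and the metric-type filter is an assumption to be verified against the project):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    # Assumes Dataflow user metrics are exported under
    # custom.googleapis.com/dataflow/...; check your project before deleting.
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        client.delete_metric_descriptor(name=descriptor.name)
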

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2125.564s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_47-18179745422536231599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_48_15-9155598234301742141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_57_32-4296668008866841218?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_05_05_30-11329389543505713737?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_47-6391020433624240418?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_47_17-13621808627441909134?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_56_10-14313796860249413640?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_50-7873194402034486920?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_47_51-1932175145747020595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_56_10-9691222142571750555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_50-14625445397150037202?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_47_54-6905938641623307518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_56_58-17493530795698764433?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_45-14117358194548232716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_47_48-5948716018305519374?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_56_36-8728573058495198107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_52-17751161697882284471?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_48_49-8105149472736404027?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_57_05-9875309794670831409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_52-7035956694827636293?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_48_21-10742982495070827963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_56_34-5079463759498081911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_39_48-8403853508093794740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-04_04_48_40-11693328706614828762?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 31s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/zkpfdcps7r26s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #35

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/35/display/redirect>

Changes:


------------------------------------------
[...truncated 5.87 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input676421a2-738c-4b15-b7a5-b2ede326c8c1"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input676421a2-738c-4b15-b7a5-b2ede326c8c1", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output676421a2-738c-4b15-b7a5-b2ede326c8c1", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T06:38:32.505079Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_22_38_30-18183170986381655048'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304063813-421166'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T06:38:32.505079Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_22_38_30-18183170986381655048]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_30-18183170986381655048?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_22_38_30-18183170986381655048 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:31.025Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_22_38_30-18183170986381655048. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:31.025Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_22_38_30-18183170986381655048.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:31.025Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:34.925Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:36.409Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.017Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.050Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.124Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.172Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.200Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.244Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.275Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.345Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.387Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.419Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.466Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.499Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.542Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:37.577Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:43.695Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:43.730Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:38:43.778Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:39:03.185Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T06:39:11.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_22_38_30-18183170986381655048 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
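
The autoscaling WARNING repeated in each run above ("Workers will scale between 1 and 100 unless maxNumWorkers is specified") is silenced by capping the worker pool on the pipeline options. A one-line sketch; the cap of 3 is an arbitrary example value:

    from apache_beam.options.pipeline_options import PipelineOptions

    # --max_num_workers bounds Dataflow autoscaling for the job.
    options = PipelineOptions(['--streaming', '--max_num_workers=3'])
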

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2049.587s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_33-969745013137221049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_46_43-7077231754544509683?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_55_45-2157612147391653255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_23_04_04-11615644292897592641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_29-8986182521428067356?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_46_49-1134181299008518073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_54_53-9869812222376782284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_30-18183170986381655048?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_46_12-5755095260885428484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_55_00-14352939456505223448?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_31-17416586468230222840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_48_06-8397169078654694502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_31-14448629841981598796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_46_37-15084910147464902565?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_55_21-8728965490516450554?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_30-12018297693930110375?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_47_04-13316214696514110960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_54_47-14898615707514528164?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_33-7237010634246369197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_46_57-16780567001062060872?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_55_25-2693791738811265581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_38_30-13517050563401456690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_47_00-6668650280592231975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_22_55_35-8264959780278195895?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 17s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/n2wv4qkltrquu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #34

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/34/display/redirect?page=changes>

Changes:

[heejong] [BEAM-9415] fix postcommit xvr tests


------------------------------------------
[...truncated 5.87 MB...]
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input8eb8708f-178e-4e6d-9eb2-2387dd33a1fa"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input8eb8708f-178e-4e6d-9eb2-2387dd33a1fa", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output8eb8708f-178e-4e6d-9eb2-2387dd33a1fa", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
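
[Editor's note] The JSON above is the full shape of the submitted job: s1 (ParallelRead) -> s2 (ParallelDo) -> s3 (ParallelWrite). A rough Python reconstruction of a pipeline that serializes to this graph, using the user_name labels from the JSON; the DoFn body is an assumption (the real one lives in dataflow_exercise_streaming_metrics_pipeline.py):

    import apache_beam as beam

    class StreamingUserMetricsDoFn(beam.DoFn):
        # Body assumed; the real DoFn updates user metrics per element.
        def process(self, element):
            yield element

    def build(pipeline, input_subscription, output_topic):
        return (
            pipeline
            | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(           # step s1
                  subscription=input_subscription)
            | 'generate_metrics' >> beam.ParDo(                     # step s2
                  StreamingUserMetricsDoFn())
            | 'dump_to_pub' >> beam.io.WriteToPubSub(output_topic)  # step s3
        )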
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T02:09:05.276792Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_18_09_04-5702776108108427904'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304020838-895028'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T02:09:05.276792Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_18_09_04-5702776108108427904]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_04-5702776108108427904?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_18_09_04-5702776108108427904 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:04.107Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_18_09_04-5702776108108427904. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:04.107Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:04.107Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_18_09_04-5702776108108427904.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:16.714Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:18.363Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.140Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.182Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.258Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.301Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.337Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.383Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.426Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.478Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.515Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.605Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.656Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.693Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.734Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:19.769Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:09:37.244Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:10:07.820Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:10:07.849Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T02:10:07.894Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_18_09_04-5702776108108427904 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
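[Editor's note] The JOB_MESSAGE_WARNING above says the project has hit the 100-descriptor limit for Dataflow custom metrics. A sketch of the cleanup the message points at, using the google-cloud-monitoring client (the 1.x call surface current in early 2020; the starts_with filter is an assumption about where the user counters land):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # project from the log above

    for descriptor in client.list_metric_descriptors(
            project_name,
            filter_='metric.type = starts_with("custom.googleapis.com/")'):
        print(descriptor.type)
        # client.delete_metric_descriptor(descriptor.name)  # uncomment to delete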

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2236.317s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_13-14455022704987799447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_19_09-1536731302355670988?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_27_59-12880654257197013158?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_36_26-5434408485502627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_01-8355950970212933787?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_18_14-3171027169980188528?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_25_55-4771918715645228695?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_04-5702776108108427904?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_42-3589543845057286102?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_06-10160142870182370373?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_27-2282565433751283379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_25_48-3014316808622202255?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_05-10092117730064690959?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_08-14980830533099953786?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_25_12-11442628641588562585?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_00-10388286912351474604?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_14-14099007157058860658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_25_39-8681323807215953386?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_07-7547071121907770718?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_51-8829358473805017964?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_26_51-1080104782681465771?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_09_04-14316118462261956062?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_17_39-6541532595294181993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_18_26_58-2313783318281903043?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 51s
62 actionable tasks: 51 executed, 11 from cache

Publishing build scan...
https://gradle.com/s/6ims3ojfstfr2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #33

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/33/display/redirect?page=changes>

Changes:

[github] [BEAM-9413] fix beam_PostCommit_Py_ValCon (#11023)


------------------------------------------
[...truncated 5.76 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input504d8481-08b1-4239-90f3-5c7536b932c1"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input504d8481-08b1-4239-90f3-5c7536b932c1", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output504d8481-08b1-4239-90f3-5c7536b932c1", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
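
[Editor's note] Step s2 above runs StreamingUserMetricsDoFn, whose user-defined metrics are what the Stackdriver descriptor warnings elsewhere in this log are about (each unique metric name costs one descriptor). An illustrative DoFn in the same spirit; the metric names here are made up:

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class UserMetricsDoFn(beam.DoFn):
        def __init__(self):
            self.byte_counter = Metrics.counter(self.__class__,
                                                'total_bytes_count')
            self.size_dist = Metrics.distribution(self.__class__,
                                                  'element_size')

        def process(self, element):
            self.byte_counter.inc(len(element))   # counter: running total
            self.size_dist.update(len(element))   # distribution: min/max/mean
            yield element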
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-04T00:40:39.969480Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_16_40_38-8629334818923298643'
 location: u'us-central1'
 name: u'beamapp-jenkins-0304004022-591357'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-04T00:40:39.969480Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_16_40_38-8629334818923298643]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_38-8629334818923298643?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_16_40_38-8629334818923298643 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:38.607Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_16_40_38-8629334818923298643.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:38.607Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:38.607Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_16_40_38-8629334818923298643. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:42.904Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:44.351Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:44.971Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:44.997Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.061Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.106Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.141Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.182Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.205Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.256Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.287Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.320Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.357Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.394Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.432Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:45.466Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:56.175Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:56.215Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:40:56.258Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:41:05.875Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-04T00:41:20.482Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_16_40_38-8629334818923298643 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
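[Editor's note] The google.auth/urllib3 DEBUG lines above are Application Default Credentials resolving on a GCE VM: a probe of 169.254.169.254, a project-id lookup, then a token fetch for the default compute service account. The same flow, triggered explicitly:

    import google.auth
    from google.auth.transport.requests import Request

    # On GCE this goes to the metadata server, exactly as in the DEBUG lines.
    credentials, project = google.auth.default()
    credentials.refresh(Request())   # performs the .../token request
    print(project, credentials.token is not None)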

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2125.174s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_43-12040085990953384501?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_49_40-17453212703075717803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_58_42-17518678542579311289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_17_07_58-11734381362167776200?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_39-14414677003381036609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_48_44-11011023550751356477?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_56_54-1187286717856138111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_38-8629334818923298643?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_48_29-4219453681764181437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_57_56-15195223775341075499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_41-17206704253794034511?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_48_33-2412083645850344677?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_57_08-11732221536278266411?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_47-6328859812851564457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_49_02-4355954479402919363?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_58_08-17001872099586931120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_36-11281276257039791439?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_50_30-6294141748257896608?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_58_37-15938577469244091463?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_42-744562241335887497?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_49_58-9379217993623674939?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_40_38-9122812848533267779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_48_53-12465289783316618602?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_16_57_14-4085412309677185364?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 36s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/5ciqsrb4icxf6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #32

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/32/display/redirect?page=changes>

Changes:

[iemejia] [website] Update link to environment_type (SDK harness configuration)

[iemejia] Fix typo on python code

[rohde.samuel] [BEAM-8335] Add PCollection to Dataframe logic for InteractiveRunner.

[github] [BEAM-8575] Modified trigger test to work for different runners.


------------------------------------------
[...truncated 5.73 MB...]
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf3ba2af8-b826-4230-8ce4-36a70bfd0e74"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf3ba2af8-b826-4230-8ce4-36a70bfd0e74", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputf3ba2af8-b826-4230-8ce4-36a70bfd0e74", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
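
[Editor's note] The "encoding" entries in the graph above (kind:windowed_value wrapping kind:bytes and kind:global_window) describe how elements are serialized on the wire. The equivalent coder, assembled by hand as a sketch:

    from apache_beam.coders import coders
    from apache_beam.transforms.window import GlobalWindow
    from apache_beam.utils.windowed_value import WindowedValue

    # kind:windowed_value(kind:bytes, kind:global_window)
    coder = coders.WindowedValueCoder(coders.BytesCoder(),
                                      coders.GlobalWindowCoder())

    wv = WindowedValue(b'payload', 0, (GlobalWindow(),))
    decoded = coder.decode(coder.encode(wv))
    print(decoded.value, decoded.timestamp, list(decoded.windows))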
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-03T23:15:20.465676Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_15_15_19-6659480555642707880'
 location: u'us-central1'
 name: u'beamapp-jenkins-0303231504-502656'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-03T23:15:20.465676Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_15_15_19-6659480555642707880]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_19-6659480555642707880?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_15_15_19-6659480555642707880 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:19.206Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:19.206Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_15_15_19-6659480555642707880.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:19.206Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_15_15_19-6659480555642707880. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:23.059Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:24.642Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.414Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.452Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.527Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.569Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.614Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.656Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.691Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.748Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.787Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.830Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.875Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.912Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:25.942Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:26.008Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:29.244Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:29.279Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:29.317Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:15:54.448Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:16:02.608Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:16:21.200Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T23:16:21.241Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_15_15_19-6659480555642707880 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
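[Editor's note] The subscription and topic names above carry a per-run UUID suffix (...2045f1cc-2d19-...), so each test run gets fresh Pub/Sub resources. A sketch of that pattern with the google-cloud-pubsub 1.x client; the project and resource names are placeholders, and 2.x clients take request={...} instead of positional arguments:

    import uuid
    from google.cloud import pubsub_v1

    project_id = 'my-project'            # placeholder
    suffix = str(uuid.uuid4())

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()

    topic_path = publisher.topic_path(project_id, 'metrics_input_' + suffix)
    sub_path = subscriber.subscription_path(project_id,
                                            'metrics_input_sub_' + suffix)

    publisher.create_topic(topic_path)                    # 1.x surface
    subscriber.create_subscription(sub_path, topic_path)  # 1.x surface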

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2079.998s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_27-14730071024555954879?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_24_38-11660915031125721408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_33_15-8046592826861785152?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_41_35-6227763983433101451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_19-6022224262772349431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_23_42-13457120183868755564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_33_07-5414720424696921595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_19-6659480555642707880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_22_21-451596195288837989?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_30_44-3697648070069739778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_28-6330243331174045561?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_23_41-5268485157983661652?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_32_48-13408794142144450594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_21-6974263910356317007?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_24_02-1851009578535088721?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_32_21-2972451484108634778?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_18-4556625034518250100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_24_10-617303975666433334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_32_28-10436123436074745490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_21-5012892998382680583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_24_06-15362866165266559069?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_15_21-7588839014605587774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_23_45-7430687244648167307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_15_32_39-16072608919051860213?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 9m 45s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/c5tnrkhs3aqjo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #31

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/31/display/redirect?page=changes>

Changes:

[chadrik] [BEAM-7746] Runtime change to timestamp/duration equality

[lcwik] [BEAM-9288] Bump version number vendored gRPC build.


------------------------------------------
[...truncated 5.72 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2045f1cc-2d19-4a02-a1ea-361b789975ef"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2045f1cc-2d19-4a02-a1ea-361b789975ef", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2045f1cc-2d19-4a02-a1ea-361b789975ef", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
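
The JSON above is the Dataflow job graph the Python SDK submitted: s1 reads raw bytes from a Pub/Sub subscription, s2 runs the StreamingUserMetricsDoFn ParDo, and s3 writes back out to a Pub/Sub topic. A minimal Beam sketch of a pipeline shaped like this (resource names are placeholders; the real test lives in apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline, and the counter name here is illustrative):

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Bumps a user counter per element, mirroring the generate_metrics step."""
        def __init__(self):
            self.bytes_counter = Metrics.counter(self.__class__, 'total_bytes_count')

        def process(self, element):
            self.bytes_counter.inc(len(element))
            yield element

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # => JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                   subscription='projects/<project>/subscriptions/<input-sub>',
                   with_attributes=False)  # raw bytes, matching the display_data
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                   topic='projects/<project>/topics/<output-topic>'))

ReadFromPubSub with with_attributes=False yields plain bytes, which is why the output_info coders above are kind:bytes wrapped in kind:windowed_value.
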
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-03T20:19:54.075712Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_12_19_51-15239943861359988035'
 location: u'us-central1'
 name: u'beamapp-jenkins-0303201933-396757'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-03T20:19:54.075712Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_12_19_51-15239943861359988035]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_51-15239943861359988035?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_12_19_51-15239943861359988035 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:19:51.190Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_12_19_51-15239943861359988035. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:19:51.190Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:19:51.190Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_12_19_51-15239943861359988035.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:19:57.389Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:19:59.458Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-b.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.237Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.309Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.473Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.596Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.716Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.851Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:00.940Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.223Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.286Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.364Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.479Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.545Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.605Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:01.681Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:04.080Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:04.121Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:04.162Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:19.597Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T20:20:37.183Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_12_19_51-15239943861359988035 after 60 seconds
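
That WARNING is the test harness giving up its wait, not the job failing: a streaming job never completes on its own, so the test bounds how long it blocks. A sketch of the pattern (pipeline is a beam.Pipeline built as in the earlier sketch; duration is in milliseconds):

    result = pipeline.run()
    result.wait_until_finish(duration=60 * 1000)  # returns after ~60s even if still RUNNING
    # The harness is then free to poll metrics and cancel the job:
    result.cancel()
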
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
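
The DEBUG lines above are google-auth resolving Application Default Credentials on the worker: it probes the GCE metadata server (169.254.169.254 / metadata.google.internal), reads the project id and the default service account, then fetches an OAuth2 access token. The same flow, spelled out as a sketch:

    import google.auth
    import google.auth.transport.requests

    # On a GCE/Dataflow VM this returns compute_engine.Credentials backed by
    # the metadata server, which is what produces the requests logged above.
    credentials, project_id = google.auth.default()

    request = google.auth.transport.requests.Request()
    credentials.refresh(request)  # GET .../service-accounts/<sa>/token
    print(project_id, bool(credentials.token))
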
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2204.333s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_51-15239943861359988035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_28_02-3824453088001350993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_38_07-9118292019071344749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_47_13-12405143790004230586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_55-12762477385706552668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_29_10-11092420633230527857?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_37_00-5779103672587377251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_49-5096261310786494031?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_27_44-13807096686506877203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_36_44-7101992719937395065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_48-2745510380031893304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_29_55-10856400871925523362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_49-8527805572333688717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_28_51-17216305910063232890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_37_17-3336226153526316072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_49-14390460836552358146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_28_23-1926799252041743515?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_36_38-4013292571885671728?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_49-16755396238975756734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_28_41-13138498020806417403?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_37_24-8916776460938901334?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_19_48-1908705643664332154?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_29_05-8160178186176279806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_12_37_05-12150970026898524698?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 43s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/qht5tgv3reoqc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #30

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/30/display/redirect?page=changes>

Changes:

[fernandodiaz] [BEAM-9424] Allow grouping by LogicalType

[echauchot] Add metrics export to documentation on the website.

[github] [BEAM-8382] Add rate limit policy to KinesisIO.Read (#9765)


------------------------------------------
[...truncated 5.74 MB...]
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdff2f4f0-ac9a-4d3f-9724-d4dacbb68f57"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputdff2f4f0-ac9a-4d3f-9724-d4dacbb68f57", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputdff2f4f0-ac9a-4d3f-9724-d4dacbb68f57", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-03T19:01:43.061511Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_11_01_41-16733475253365558757'
 location: u'us-central1'
 name: u'beamapp-jenkins-0303190123-644422'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-03T19:01:43.061511Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_11_01_41-16733475253365558757]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_41-16733475253365558757?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_11_01_41-16733475253365558757 is in state JOB_STATE_RUNNING
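
The Create job / monitoring-console messages come from apache_beam.runners.dataflow.internal.apiclient at submission time. A hedged sketch of the options behind a submission like this one (project, region, and job name are taken from the log; the bucket is a placeholder):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--temp_location=gs://<bucket>/temp',  # placeholder
        '--streaming',                         # => JOB_TYPE_STREAMING
        '--job_name=beamapp-jenkins-0303190123-644422',
        # '--max_num_workers=2',  # would cap the 1..100 autoscaling range warned about below
    ])
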
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:41.613Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_11_01_41-16733475253365558757. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:41.613Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_11_01_41-16733475253365558757.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:41.613Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:45.236Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:46.490Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.059Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.093Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.180Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.224Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.258Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.304Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.338Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.387Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.424Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.459Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.500Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.527Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.558Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:01:47.598Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:02:05.206Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:02:05.240Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:02:05.285Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:02:22.942Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
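
That WARNING means the project has hit Cloud Monitoring's per-project quota for custom metric descriptors, so new user counters from this job will not get their own descriptors. Stale descriptors can be listed and pruned through the Monitoring API; a sketch using the google-cloud-monitoring client (assumed available; the filter string is illustrative):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'

    # List custom metric descriptors; the filter narrows to Dataflow-created ones.
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    })
    for descriptor in descriptors:
        print(descriptor.type)
        # Deleting frees quota but is irreversible:
        # client.delete_metric_descriptor(request={'name': descriptor.name})
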
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T19:02:29.809Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_11_01_41-16733475253365558757 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2174.107s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_39-11697606897889223464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_11_30-1315952827597940316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_20_34-5169290456040142812?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_28_58-14981449969158328352?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_38-11497576829870670259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_10_04-11843513072009184066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_18_28-5150318939580555701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_41-16733475253365558757?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_09_08-8978400576668332596?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_18_02-3191183913805882231?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_40-2472286611648606373?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_09_10-3661137777225065878?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_16_58-10446341833658339839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_36-11363318276840274481?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_10_45-5316062307181549445?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_19_05-9585118059780663968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_39-601601091089776915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_09_59-6991922226656570036?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_18_31-15951443256431500329?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_38-7344098939753200788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_10_38-770832700986524584?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_01_40-15117189517868530894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_10_34-9968501711848307482?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_11_19_21-15537600795813411447?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 11m 25s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/xivwtm6dzlo7s

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #29

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/29/display/redirect?page=changes>

Changes:

[kamil.wasilewski] Add integration test for AnnotateImage transform

[kamil.wasilewski] Fix: skip test if GCP dependencies are not installed


------------------------------------------
[...truncated 5.63 MB...]
    ]
  }, 
  "name": "beamapp-jenkins-0303172717-162213", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input7e95ccb4-a611-4169-8a3e-15b1c3eb12aa"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input7e95ccb4-a611-4169-8a3e-15b1c3eb12aa", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output7e95ccb4-a611-4169-8a3e-15b1c3eb12aa", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
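
Note the UUID suffix (7e95ccb4-...) shared by the input subscription and output topic: the harness creates throwaway Pub/Sub resources per test run and wires the pipeline to them. A sketch of that setup with google-cloud-pubsub (project and name prefixes copied from the log; the input topic name is assumed):

    import uuid
    from google.cloud import pubsub_v1

    suffix = str(uuid.uuid4())
    project = 'apache-beam-testing'
    input_topic = ('projects/%s/topics/'
                   'exercise_streaming_metrics_topic_input%s' % (project, suffix))
    input_sub = ('projects/%s/subscriptions/'
                 'exercise_streaming_metrics_subscription_input%s' % (project, suffix))
    output_topic = ('projects/%s/topics/'
                    'exercise_streaming_metrics_topic_output%s' % (project, suffix))

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()
    publisher.create_topic(request={'name': input_topic})
    publisher.create_topic(request={'name': output_topic})
    subscriber.create_subscription(request={'name': input_sub, 'topic': input_topic})
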
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-03-03T17:27:38.633405Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-03-03_09_27_37-4462231857176568391'
 location: u'us-central1'
 name: u'beamapp-jenkins-0303172717-162213'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-03-03T17:27:38.633405Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-03-03_09_27_37-4462231857176568391]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_37-4462231857176568391?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-03-03_09_27_37-4462231857176568391 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:37.312Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-03-03_09_27_37-4462231857176568391. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:37.312Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-03-03_09_27_37-4462231857176568391.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:37.312Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:40.941Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.046Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.647Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.682Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.765Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.826Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.865Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.890Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:42.924Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.001Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.035Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.059Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.102Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.145Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.179Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:27:43.211Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-03-03T17:28:12.471Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-03-03_09_27_37-4462231857176568391 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 28 tests in 2438.334s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_32-1002054247720515246?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_35_43-6695857659580164960?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_51_46-7361615604204896559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_59_31-8639873168498207160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_37-4462231857176568391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_36_08-1334396526047566392?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_50_47-13578637360456649749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_32-3807945695197777127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_35_56-14627229015108236955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_50_33-18028379469680363641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_34-12314908141898629509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_37_11-4879845241838220517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_50_49-4674906675050811183?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_35-15288871260485626949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_38_51-9008248194485841167?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_50_40-17678106600168360839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_30-1702617831071914995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_36_11-8175925989709640971?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_36-2352777028253210199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_36_38-10898875284631373935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_50_24-10898163986845837382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_27_33-5515650607892683020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_36_24-7631150879718912767?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-03-03_09_51_36-15768509218888337061?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 113

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 5s
62 actionable tasks: 45 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/wiwd2sjafibnu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
