Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/04/09 13:08:32 UTC

Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #327

See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/327/display/redirect?page=changes>

Changes:

[michael.jacoby] [BEAM-9647] fixes MQTT clientId to long


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409123124-344872.1586435484.345133/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409123124-344872", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb424cae0-b841-4211-96ce-c04e6687d93b"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb424cae0-b841-4211-96ce-c04e6687d93b", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputb424cae0-b841-4211-96ce-c04e6687d93b", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
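
For orientation, the three steps in the job graph above (s1 ParallelRead, s2 ParallelDo, s3 ParallelWrite) correspond to a Pub/Sub-to-Pub/Sub streaming pipeline. Below is a minimal sketch of that shape; the subscription and topic names are placeholders and the counter name is illustrative — the actual test lives in apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.

    import apache_beam as beam
    from apache_beam.metrics import Metrics
    from apache_beam.options.pipeline_options import PipelineOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        """Passes elements through while bumping a user counter, mirroring
        the 'generate_metrics' step; the counter name here is illustrative."""
        def __init__(self):
            self.total_messages = Metrics.counter(self.__class__, 'total_messages')

        def process(self, element):
            self.total_messages.inc()
            yield element

    options = PipelineOptions(streaming=True)  # streaming job, as in the graph
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/<project>/subscriptions/<input-sub>')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/<project>/topics/<output-topic>'))
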
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T12:31:39.171255Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_05_31_38-11111221124247775621'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409123124-344872'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T12:31:39.171255Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_05_31_38-11111221124247775621]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_38-11111221124247775621?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_05_31_38-11111221124247775621 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:38.024Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:38.024Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_05_31_38-11111221124247775621. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:38.024Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_05_31_38-11111221124247775621.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:41.549Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:42.246Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:42.844Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:42.878Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:42.937Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:42.977Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.017Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.039Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.065Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.119Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.156Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.192Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.230Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.271Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.305Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:43.330Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:45.640Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:45.678Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:31:45.714Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:32:00.106Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T12:32:13.314Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_05_31_38-11111221124247775621 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
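
On the JOB_MESSAGE_WARNING above about the project already holding 100 Dataflow-created metric descriptors: one way to prune stale descriptors is the Cloud Monitoring v3 client rather than the APIs Explorer links in the log. A rough sketch follows; the filter prefix and request shapes are assumptions to verify against the Monitoring docs before running a destructive delete.

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"  # assumption: target project

    # Assumption: Dataflow user metrics surface as custom metric descriptors;
    # adjust the filter to match the descriptors you actually want to prune.
    request = {
        "name": project_name,
        "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print("Deleting stale descriptor:", descriptor.type)
        client.delete_metric_descriptor(request={"name": descriptor.name})
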

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2229.809s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_36-14320863212688272464?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_24-1551393118034793851?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_50_07-16400931160496899029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_58_55-4127532802486497937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_37-15579171920633149770?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_21-12599035785315647816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_50_00-15007792031826873884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_38-11111221124247775621?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_39_38-11566554763332681287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_49_15-1278905031951718860?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_38-1669478339918267077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_26-17759922889309050180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_49_23-5737058542119436663?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_38-2498499998336545405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_41_31-13191605477121459074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_38-1810469376050132447?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_23-9208419293536131790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_49_21-7236713601720070308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_39-7808632348005711073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_21-2412822613791972644?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_48_49-17688273577717994842?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_31_36-8062438589643130666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_40_22-17640029376893713196?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_05_48_39-16623986495208265192?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 14s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/urxw2e2jo4ajo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Py_VR_Dataflow_V2 #344

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/344/display/redirect?page=changes>


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #343

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/343/display/redirect?page=changes>

Changes:

[github] [BEAM-9443] support direct_num_workers=0 (#11372)


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0411004942-220410.1586566182.220549/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0411004942-220410", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputaa8673ea-0873-429a-b07f-a9405dc4d7ad"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputaa8673ea-0873-429a-b07f-a9405dc4d7ad", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputaa8673ea-0873-429a-b07f-a9405dc4d7ad", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-11T00:49:56.347982Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_17_49_55-10099482871119742387'
 location: u'us-central1'
 name: u'beamapp-jenkins-0411004942-220410'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-11T00:49:56.347982Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_17_49_55-10099482871119742387]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_55-10099482871119742387?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_17_49_55-10099482871119742387 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:49:55.288Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_17_49_55-10099482871119742387. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:49:55.288Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:49:55.288Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_17_49_55-10099482871119742387.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:49:59.532Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:00.622Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.478Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.512Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.602Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.643Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.676Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.719Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.752Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.808Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.845Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.884Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.924Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.961Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:01.990Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:02.029Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:23.748Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:36.022Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:36.055Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:36.089Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-11T00:50:59.100Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_17_49_55-10099482871119742387 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
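
On the "Timing out on waiting for job ... after 60 seconds" warning above: the test harness bounds how long it waits for a streaming job instead of blocking forever. In the Beam Python API that bound is the optional duration argument (in milliseconds) to wait_until_finish; a rough sketch with a placeholder pipeline:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions(streaming=True))
    _ = p | beam.Create(['placeholder'])  # stand-in for the real transforms

    result = p.run()
    # duration is in milliseconds; if the streaming job is still running when
    # it elapses, the runner logs the "Timing out on waiting for job ..."
    # warning and returns control so the test can inspect metrics.
    result.wait_until_finish(duration=60 * 1000)
    result.cancel()  # streaming jobs must be stopped explicitly
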

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2447.666s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_56-3829465434647277484?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_59_55-16069485335955404240?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_10_53-14000007014313125339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_21_01-10695419932106326452?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_54-12181901436121603081?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_00_22-3269628360180372372?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_09_55-1603894657264751275?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_55-10099482871119742387?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_57_59-1506866460066652146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_07_42-6487761657656715342?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_57-4619325766714333380?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_00_55-13667811791429410326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_54-15331569102353478758?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_59_48-2756683646524054489?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_08_26-4097479794701336112?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_56-11618604366456418691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_00_54-2214945810986079540?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_10_48-13542333954754659189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_55-5378523414765424107?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_00_26-12587138076869983666?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_09_49-9246276142055232230?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_49_57-7112968632362740254?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_17_59_38-4434817291194104216?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_18_10_01-6936113372755508333?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 34s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/t6dgfj7v4az56

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #342

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/342/display/redirect?page=changes>

Changes:

[ankurgoenka] [BEAM-9735] Adding Always trigger and using it in Reshuffle

[boyuanz] [BEAM-9562] Update Element.timer, Element.Timer to Element.timers and


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410223719-965055.1586558239.965188/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410223719-965055", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input3c617590-f1b7-46c8-a3f7-27bfc6cd1de0"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input3c617590-f1b7-46c8-a3f7-27bfc6cd1de0", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output3c617590-f1b7-46c8-a3f7-27bfc6cd1de0", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T22:37:34.764555Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_15_37_33-11068556560292598338'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410223719-965055'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T22:37:34.764555Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_15_37_33-11068556560292598338]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_33-11068556560292598338?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_15_37_33-11068556560292598338 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:33.437Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_15_37_33-11068556560292598338.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:33.437Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:33.437Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_15_37_33-11068556560292598338. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:48.969Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:49.680Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.215Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.239Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.288Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.323Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.351Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.373Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.395Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.435Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.465Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.488Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.517Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.540Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.567Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:50.587Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:58.135Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:58.160Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:37:58.188Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:38:22.134Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T22:38:25.454Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_15_37_33-11068556560292598338 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2293.221s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_32-9510876708818747095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_49-16144474500612469406?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_56_57-16895723003282022150?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_16_06_00-11740710473933692035?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_33-11068556560292598338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_46_15-14395034126926220928?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_56_14-13191559477590794730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_33-16009529350145040822?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_28-71938820689900821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_56_57-3645657121340176094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_33-6717460104611752507?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_11-18325638805086623205?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_55_54-2554054144045054010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_31-6073210548475040732?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_12-5034324591946135105?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_56_04-9430080664720325153?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_33-13916904117941316659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_46_42-9841435260776053234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_55_55-15590749826291094350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_34-16878509147309538502?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_04-12721412470842188193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_37_32-2580134802684239120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_47_15-3976263717104261296?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_15_55_52-13162946033396361331?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 45s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/g6i4i2ceoyopc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #341

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/341/display/redirect?page=changes>

Changes:

[robertwb] Attempt to stage resources via new API in portable runner.

[pabloem] Fix from_container_image call


------------------------------------------
[...truncated 5.43 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410205158-066807.1586551918.067009/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410205158-066807", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input422e180e-03c2-4712-aaa7-1af06ca4fbcf"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input422e180e-03c2-4712-aaa7-1af06ca4fbcf", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output422e180e-03c2-4712-aaa7-1af06ca4fbcf", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T20:52:13.667974Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_13_52_12-10771267790012963910'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410205158-066807'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T20:52:13.667974Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_13_52_12-10771267790012963910]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_12-10771267790012963910?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_13_52_12-10771267790012963910 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:12.582Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:12.582Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_13_52_12-10771267790012963910.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:12.582Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_13_52_12-10771267790012963910. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:17.909Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:18.659Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.474Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.508Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.581Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.617Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.650Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.684Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.708Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.771Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.812Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.849Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.889Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.931Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.967Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:19.995Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:25.350Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:25.386Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:25.425Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:30.343Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
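The warning above is a per-project quota on Stackdriver custom metric descriptors; the two APIs Explorer links wrap monitoring.projects.metricDescriptors.list and .delete. A sketch of the suggested cleanup using the google-cloud-monitoring client is below; it assumes that library (>= 2.0) is available, and the "custom.googleapis.com/" prefix in the filter is an assumption about how the stale descriptors are named.

    # Sketch: enumerate and delete stale custom metric descriptors.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        # descriptor.name is the full resource name that delete expects.
        client.delete_metric_descriptor(request={'name': descriptor.name})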
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T20:52:51.256Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_13_52_12-10771267790012963910 after 60 seconds
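This "Timing out on waiting" line is the test harness giving up its bounded wait, not the job failing; in the Beam Python SDK that bound is PipelineResult.wait_until_finish, whose optional duration argument is in milliseconds. A minimal sketch of the pattern, where pipeline is the beam.Pipeline from the earlier sketch and the cancel() is an assumption about test cleanup:

    # Bounded wait matching the 60-second warning above.
    result = pipeline.run()
    result.wait_until_finish(duration=60 * 1000)  # duration is in milliseconds
    result.cancel()  # a streaming job keeps running unless cancelled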
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
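The metadata.google.internal requests above are Application Default Credentials being resolved on the worker: google.auth falls through to the GCE metadata server and mints a token for the default compute service account. Roughly, as a sketch:

    # Credential resolution behind the DEBUG lines above.
    import google.auth
    import google.auth.transport.requests

    credentials, project = google.auth.default()
    credentials.refresh(google.auth.transport.requests.Request())
    assert credentials.token is not None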
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2262.480s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_12-2241852616570282764?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_00_32-8846340136394182377?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_11_20-6144693303885840159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_19_52-272920781110524077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_11-13095261719813295890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_00_58-18374286459483668385?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_12-10771267790012963910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_59_59-2144558353128705829?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_09_41-738131887529184143?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_10-8867159994597411204?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_01_19-3223407454669144364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_10_22-5655367756815940567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_12-7057644277159162865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_01_30-5175033866977088858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_10_34-52205297453004124?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_16-438342510291491658?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_00_51-5724898524725310636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_10_06-2158019178464486656?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_13-16412045523527140717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_00_34-6133343635120958212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_09_39-12745744897810698649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_13_52_11-1711157197060577015?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_01_18-18346236278144775187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_14_10_11-6713952001363603324?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 7s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/ny7h66tp2esyq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #340

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/340/display/redirect?page=changes>

Changes:

[veblush] Upgrades gcsio to 2.1.2

[github] Add --region to changelog


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410191129-157356.1586545889.157574/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410191129-157356", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf7c2d916-4386-4ad9-8e94-fd24c4b1dd60"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputf7c2d916-4386-4ad9-8e94-fd24c4b1dd60", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputf7c2d916-4386-4ad9-8e94-fd24c4b1dd60", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T19:11:45.742133Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_12_11_44-11168909730183423833'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410191129-157356'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T19:11:45.742133Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_12_11_44-11168909730183423833]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_44-11168909730183423833?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_12_11_44-11168909730183423833 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:44.546Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_12_11_44-11168909730183423833. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:44.546Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_12_11_44-11168909730183423833.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:44.546Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:47.967Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:48.681Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.343Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.376Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.455Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.506Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.544Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.579Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.616Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.685Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.724Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.770Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.825Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.859Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.894Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:49.972Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:58.651Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:58.682Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:11:58.712Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:12:11.779Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T19:12:23.416Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_12_11_44-11168909730183423833 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2218.000s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_50-13490390579733643739?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_20_37-14542669923363397038?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_30_38-17807952529070305307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_39_54-11463456319556619018?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_44-11168909730183423833?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_19_17-147595939979523019?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_27_53-7754367921648567610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_46-3047401127243509823?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_21_07-12393249637834495917?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_30_36-5768839773356416762?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_46-6868259628999334734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_20_04-10441601613234788710?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_29_48-2771016834285145032?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_41-2533073544434565399?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_20_34-399123202068070749?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_30_01-15378674239614361428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_45-17498140483194819012?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_21_06-7450574892422498745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_46-12425107711485660025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_20_05-4824549877576563614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_29_51-15536393540860075097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_11_43-10522732235071866555?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_21_11-5132698480061172613?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_12_29_49-1406314268918587658?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 43s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/wuxr2n4szwb6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #339

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/339/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-9651] Prevent StreamPool and stream initialization livelock


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410173209-898724.1586539929.898972/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410173209-898724", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input88ddca24-406c-46c1-a970-3b4b0265f243"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input88ddca24-406c-46c1-a970-3b4b0265f243", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output88ddca24-406c-46c1-a970-3b4b0265f243", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T17:32:25.972281Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_10_32_24-11675833156123819657'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410173209-898724'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T17:32:25.972281Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_10_32_24-11675833156123819657]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_24-11675833156123819657?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_10_32_24-11675833156123819657 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:32:24.840Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_10_32_24-11675833156123819657. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:32:24.840Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:32:24.840Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_10_32_24-11675833156123819657.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:14.925Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:15.742Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.321Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.345Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.406Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.435Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.467Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.487Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.509Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.554Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.574Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.601Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.634Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.657Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.679Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:16.700Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:24.165Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:24.197Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:24.228Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:51.950Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T17:33:58.938Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_10_32_24-11675833156123819657 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2369.585s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_25-14410505944517396227?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_42_22-5696261366176873318?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_51_55-9907596884531496028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_11_01_01-8532662469480743996?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_23-9960094810891723627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_21-17947340456725534339?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_50_10-13521019482656066476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_24-11675833156123819657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_14-10973590838536009251?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_50_42-6932755629433355978?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_24-9444442847439035949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_09-10148371795096512799?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_51_12-8228467726579557922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_24-16311920620269229498?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_51-9625352401192325195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_51_50-2142889409569519327?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_21-15132141007800117172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_16-15249964083465063117?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_26-9705042662117252077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_41_24-9680255761198123912?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_50_22-12252807608446685742?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_32_23-18143092971327667977?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_40_53-4832632247306182699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_10_50_21-12212306223945893629?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 42s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/65qyfoh55wmoa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #338

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/338/display/redirect?page=changes>

Changes:

[michal.walenia] [BEAM-9734] Revert #11122


------------------------------------------
[...truncated 5.42 MB...]
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410152312-660480.1586532192.660706/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410152312-660480", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb5d32022-af13-4462-96b2-841d86b91975"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputb5d32022-af13-4462-96b2-841d86b91975", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputb5d32022-af13-4462-96b2-841d86b91975", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T15:23:38.623649Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_08_23_37-4086946159012012740'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410152312-660480'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T15:23:38.623649Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_08_23_37-4086946159012012740]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_37-4086946159012012740?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_08_23_37-4086946159012012740 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:37.577Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_08_23_37-4086946159012012740.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:37.577Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:37.577Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_08_23_37-4086946159012012740. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:40.803Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:41.686Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.167Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.203Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.277Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.308Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.339Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.372Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.393Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.463Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.492Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.513Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.556Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.589Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.623Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:42.655Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:23:52.350Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:24:27.564Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:24:27.605Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T15:24:27.650Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_08_23_37-4086946159012012740 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2374.394s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_49-5277762825911651256?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_33_55-6252041719509613141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_43_52-435071769921023053?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_53_17-5498804076071392316?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_40-11085731284069104416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_35_08-16697921210893372894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_37-4086946159012012740?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_32_08-7414371731238298252?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_42_06-2003235004115117259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_45-8363474993688053557?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_33_00-8538510111459325901?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_42_21-15449589759715698884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_45-17349036979966323645?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_33_35-7554951301445609287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_42_07-886709273185199959?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_36-6359032845620754642?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_32_07-8333028578861035612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_42_01-2195543763547112590?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_46-5810920471199792132?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_33_49-7570350624101725911?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_42_37-15634520098648818967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_23_41-2823706016817559321?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_33_33-17893291977617717159?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_08_43_02-17771619371216280732?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 35s
64 actionable tasks: 50 executed, 14 from cache

Publishing build scan...
https://gradle.com/s/b3zg2vgciahsk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #337

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/337/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410124016-920882.1586522416.921018/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410124016-920882", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputcecb32ed-e186-4e0a-990e-dbfc5888d203"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputcecb32ed-e186-4e0a-990e-dbfc5888d203", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputcecb32ed-e186-4e0a-990e-dbfc5888d203", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
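Note: for orientation, the job graph just emitted (steps s1 -> s2 -> s3) corresponds to a streaming pipeline of roughly the following shape. This is a minimal sketch, not the actual dataflow_exercise_streaming_metrics_pipeline source; the subscription/topic strings are placeholders, and PassThroughDoFn stands in for StreamingUserMetricsDoFn:

    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub, WriteToPubSub
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class PassThroughDoFn(beam.DoFn):
        # Stand-in for StreamingUserMetricsDoFn; simply forwards each element.
        def process(self, element):
            yield element

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # yields JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        (p
         | 'ReadFromPubSub' >> ReadFromPubSub(
             subscription='projects/<project>/subscriptions/<input-sub>')  # step s1
         | 'generate_metrics' >> beam.ParDo(PassThroughDoFn())             # step s2
         | 'dump_to_pub' >> WriteToPubSub(
             'projects/<project>/topics/<output-topic>'))                  # step s3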
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T12:40:31.982054Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-10_05_40_30-18337808113871484010'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410124016-920882'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T12:40:31.982054Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-10_05_40_30-18337808113871484010]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-18337808113871484010?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-10_05_40_30-18337808113871484010 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:30.945Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:30.945Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-10_05_40_30-18337808113871484010.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:30.945Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-10_05_40_30-18337808113871484010. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:34.606Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:35.275Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:35.821Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:35.851Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:35.940Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.037Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.075Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.108Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.192Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.226Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.269Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.334Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.370Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.404Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:36.437Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:40.339Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:40.378Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:40.431Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:40:55.192Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
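Note: that metric-descriptor warning is driven by user-defined metrics: on Dataflow, each unique (namespace, name) pair declared through the Metrics API becomes one Stackdriver metric descriptor, and per the message above the project has hit the 100-descriptor cap. A hedged illustration of how such metrics are declared in a DoFn (the namespace and metric names here are invented):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class InstrumentedDoFn(beam.DoFn):
        def __init__(self):
            super(InstrumentedDoFn, self).__init__()
            # Each counter/distribution below becomes one metric descriptor
            # when the pipeline runs on Dataflow.
            self.elements = Metrics.counter('example_ns', 'elements_seen')
            self.sizes = Metrics.distribution('example_ns', 'element_size_bytes')

        def process(self, element):
            self.elements.inc()
            self.sizes.update(len(element))
            yield element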
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T12:41:04.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-10_05_40_30-18337808113871484010 after 61 seconds
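Note: the "Timing out on waiting" warning is the test harness giving up on a bounded wait, not the job failing; the streaming job stays in JOB_STATE_RUNNING while the test carries on. The call pattern is roughly the following (a sketch; the concrete duration value is an assumption):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    p = beam.Pipeline(options=PipelineOptions())  # pipeline built as sketched earlier
    result = p.run()
    # duration is in milliseconds; wait_until_finish returns once it elapses
    # even if the streaming job is still running, which is what produces the
    # WARNING logged above.
    result.wait_until_finish(duration=60 * 1000)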
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2467.155s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_27-7625890682879676610?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_51_19-11443346434747019976?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_06_00_57-16777523727097555807?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_06_12_25-14048766865280669884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_29-3638581333227800800?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_50_11-13122394280791671417?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_59_14-1955330011696416604?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-18337808113871484010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_48_15-8390524425837541722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_56_56-6158022962781750779?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-1986450825154158750?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_50_09-17576850955891085488?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_59_37-14646276436240798144?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-15363985465146992933?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_50_33-11327516738850571632?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_06_00_02-7541898406601124220?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-2479787039876491345?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_48_50-998309504377524468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_58_33-3782381108707463525?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_29-6221638474393909378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_48_43-349407168238536273?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_57_20-2060469155357878487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_40_30-6159119090476569025?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_05_50_40-6866798274324159304?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 55s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/hg5fxvykpscs2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #336

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/336/display/redirect>

Changes:


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410064125-148697.1586500885.148905/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410064125-148697", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input93715e1e-9b16-4399-9bee-65acd72091ec"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input93715e1e-9b16-4399-9bee-65acd72091ec", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output93715e1e-9b16-4399-9bee-65acd72091ec", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T06:41:39.785697Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_23_41_38-8103910456084940393'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410064125-148697'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T06:41:39.785697Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_23_41_38-8103910456084940393]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_38-8103910456084940393?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_23_41_38-8103910456084940393 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:38.195Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_23_41_38-8103910456084940393. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:38.195Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_23_41_38-8103910456084940393.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:38.195Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:43.300Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:44.564Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.361Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.393Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.472Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.518Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.554Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.591Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.626Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.693Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.725Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.758Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.805Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.844Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.875Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:45.909Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:52.577Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:52.624Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:41:52.665Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:42:01.084Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T06:42:21.874Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_23_41_38-8103910456084940393 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2200.903s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_37-16039906162751287641?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_06-17503893190578067698?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_54-5259483307477077379?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-10_00_08_57-12408023038293364493?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_38-8103910456084940393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_02-15127391304474081839?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_35-9590388701915311243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_38-5541044225283079241?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_56-15530401311464954703?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_28-345079778547780077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_39-7814083018151758478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_22-12499731519609931574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_20-9385792149291045687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_37-13117077806599234773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_22-1102525733091610789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_19-12391675102994947089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_39-12711818276276184810?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_23-8614293993343094773?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_30-13589181015000699168?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_40-7258139819832102564?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_50_53-15847247245885095382?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_59_43-11415824953879731399?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_41_38-5018336089258487106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_23_51_35-17712456763896456535?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 39s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/zqs65l3dhtypi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #335

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/335/display/redirect?page=changes>

Changes:

[kcweaver] Moving to 2.22.0-SNAPSHOT on master branch.


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410050818-395151.1586495298.395300/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410050818-395151", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0bc54a9d-84eb-49e3-8b60-726b497b96ee"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0bc54a9d-84eb-49e3-8b60-726b497b96ee", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output0bc54a9d-84eb-49e3-8b60-726b497b96ee", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T05:08:34.402791Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_22_08_33-14843833713901486858'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410050818-395151'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T05:08:34.402791Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_22_08_33-14843833713901486858]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_33-14843833713901486858?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_22_08_33-14843833713901486858 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:33.370Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:33.370Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_22_08_33-14843833713901486858. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:33.370Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_22_08_33-14843833713901486858.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:37.086Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.030Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.545Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.582Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.649Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.688Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.718Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.750Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.806Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.867Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.898Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.937Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:38.977Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:39.006Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:39.032Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:39.086Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:44.729Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:44.753Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:08:44.816Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:09:04.195Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T05:09:08.409Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_22_08_33-14843833713901486858 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2206.158s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_30-11485799154485419108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_17_55-13607127344510699269?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_26_53-187277776431737995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_36_15-15414518318161796206?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_33-14843833713901486858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_16_05-13145056138247474657?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_25_09-17141468630360613983?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_34-15040011359894211801?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_16_35-12754191483647780097?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_25_24-11960138845630989995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_31-6489689627075231064?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_16_37-6035773208206423576?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_25_34-17622817336179311412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_29-11151659215577731190?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_16_23-11667477068542899745?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_26_21-8304900030553887384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_35-5782801954077164145?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_18_13-3641340265198493534?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_26_49-520223574097166907?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_34-8360212696336914987?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_16_57-5840747948531965573?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_25_47-1432264165711427631?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_08_31-7845983140951580071?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_22_18_19-9844369497491227814?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 18m 54s
64 actionable tasks: 63 executed, 1 from cache

Publishing build scan...
https://gradle.com/s/fgdxobvrsdypy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #334

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/334/display/redirect?page=changes>

Changes:

[github] add missing bracket

[boyuanz] [BEAM-9562, BEAM-6274] Fix-up timers to use Elements.Timer proto in data

[robertwb] Allow unset write threshold for state backed iterable coder.

[github] Revert "[BEAM-9651] Prevent StreamPool and stream initialization

[github] [BEAM-9727] Automatically set required experiment flags for dataflow

[github] Update environments.py to add a method to specify container image


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410015131-383212.1586483491.383341/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410015131-383212", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input08376546-475f-46af-a01a-0a48812cd9c7"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input08376546-475f-46af-a01a-0a48812cd9c7", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output08376546-475f-46af-a01a-0a48812cd9c7", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T01:51:44.613644Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_18_51_43-762043860750894243'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410015131-383212'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T01:51:44.613644Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_18_51_43-762043860750894243]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_43-762043860750894243?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_18_51_43-762043860750894243 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:43.552Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_18_51_43-762043860750894243. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:43.552Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:43.552Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_18_51_43-762043860750894243.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:46.609Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:47.403Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.021Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.042Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.195Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.229Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.255Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.276Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.304Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.355Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.384Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.404Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.438Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.473Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.509Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:48.538Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:53.866Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:53.892Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:51:53.926Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:52:00.848Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
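
This metric-descriptor warning recurs on every run of the suite: each distinct user metric name creates one custom metric descriptor, and the project has hit the 100-descriptor cap, so new custom metrics are silently dropped from Stackdriver. A rough cleanup sketch against the APIs linked in the warning, assuming a recent google-cloud-monitoring client; the metric.type filter prefix is an assumption, so list and inspect the stale descriptors before deleting anything:

    from google.cloud import monitoring_v3

    def delete_stale_metric_descriptors(project_id):
        client = monitoring_v3.MetricServiceClient()
        # The filter prefix is an assumption about how these user metrics are
        # exported; verify against a plain list call before wiring up deletes.
        request = {
            "name": "projects/%s" % project_id,
            "filter": 'metric.type = starts_with("custom.googleapis.com/dataflow")',
        }
        for descriptor in client.list_metric_descriptors(request=request):
            client.delete_metric_descriptor(name=descriptor.name)
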
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T01:52:19.219Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_18_51_43-762043860750894243 after 60 seconds
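
The timeout warning above is client-side, not a job failure: the test harness waits on the PipelineResult with a bounded duration and then moves on to its assertions while the streaming job keeps running. A minimal sketch of that pattern (in the Beam Python API, wait_until_finish takes a duration in milliseconds):

    def run_for_a_while(pipeline, timeout_ms=60 * 1000):
        """Run a streaming pipeline and wait at most timeout_ms for it.

        When the duration elapses with the job still running, the runner logs
        the "Timing out on waiting for job ..." warning seen above and simply
        returns; the caller can then inspect result.metrics() and cancel().
        """
        result = pipeline.run()
        result.wait_until_finish(duration=timeout_ms)
        return result
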
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2207.792s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_42-1582866737970161509?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_01_20-6562548019366428203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_11_02-2304313916682389790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_20_00-16485632374597724224?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_43-762043860750894243?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_59_59-10422521249144478606?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_09_09-7441307787405889083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_45-14206145900490870704?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_01_37-16508363069426232968?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_10_56-9821157370893407408?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_43-16090570185263202626?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_01_54-4182688336307577884?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_44-17280329235368673618?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_00_42-6854270253982076636?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_10_48-874730789065848193?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_42-7473790098377774418?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_01_05-11829718147636029717?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_10_12-4744648451134788614?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_44-9345118461684386919?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_00_19-16772610522639784519?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_09_55-7764308298319767649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_18_51_43-2871459826556226073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_01_14-2477016896746210274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_19_10_36-13402989106441804136?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 46s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/y2ge3giqd4upc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #333

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/333/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-9651] Prevent StreamPool and stream initialization livelock


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0410000211-370162.1586476931.370308/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0410000211-370162", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5c4e94a5-337c-4fc8-ba76-cfd02da5ba9a"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input5c4e94a5-337c-4fc8-ba76-cfd02da5ba9a", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output5c4e94a5-337c-4fc8-ba76-cfd02da5ba9a", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
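
The ParDo in step s2 exists only to exercise user-defined metrics, which is what this suite validates. An illustrative sketch of such a DoFn using the Beam metrics API; the counter and distribution names here are made up, not the ones the real StreamingUserMetricsDoFn registers:

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class UserMetricsDoFn(beam.DoFn):
        """Illustrative DoFn that updates user metrics on every element."""

        def __init__(self):
            # Each distinct metric name mints a metric descriptor in
            # Stackdriver, which is what the 100-descriptor warning is about.
            self.element_counter = Metrics.counter(self.__class__,
                                                   'total_elements')
            self.size_distribution = Metrics.distribution(self.__class__,
                                                          'element_size')

        def process(self, element):
            self.element_counter.inc()
            self.size_distribution.update(len(element))
            yield element
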
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-10T00:02:25.617522Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_17_02_24-3973634596217791966'
 location: u'us-central1'
 name: u'beamapp-jenkins-0410000211-370162'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-10T00:02:25.617522Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_17_02_24-3973634596217791966]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_24-3973634596217791966?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_17_02_24-3973634596217791966 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:24.204Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:24.205Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_17_02_24-3973634596217791966.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:24.205Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_17_02_24-3973634596217791966. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:27.701Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:28.487Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:28.967Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:28.999Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.069Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.111Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.143Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.178Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.216Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.269Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.302Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.335Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.377Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.410Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.443Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:29.477Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:36.438Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:36.466Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:36.503Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:02:59.027Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-10T00:03:04.386Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_17_02_24-3973634596217791966 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2267.027s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_24-12997562990933708874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_11_34-12812234567107808421?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_21_11-10014027109272091871?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_31_15-15130870507917600797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_22-17233365395623676876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_11_33-11021088760189317021?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_21_04-7567117561401101696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_24-3973634596217791966?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_10_44-14611779976700774556?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_19_07-3257608669124583285?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_24-7543499368120367472?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_11_17-14610073210889988796?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_21_01-8830009466776653120?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_25-15709992512924605962?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_11_49-12742086671955432927?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_23-16255027362470459595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_11_09-12503384318813581505?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_20_21-13144407829985731265?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_23-11236938094566265615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_10_27-12506727279188592063?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_19_10-12115686454097943222?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_02_25-13887749885078580953?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_10_22-668378702041899474?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_17_19_41-13274595203222227292?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 52s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/pd6tmsir3ltpm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #332

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/332/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-4374] Fix missing deletion of metrics.


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409224333-379233.1586472213.379478/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409224333-379233", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2b918c12-96da-4823-8f5a-8e5629c9e7ea"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input2b918c12-96da-4823-8f5a-8e5629c9e7ea", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output2b918c12-96da-4823-8f5a-8e5629c9e7ea", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
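
Note that the subscription and topic names above carry a fresh UUID suffix per run, so the suite evidently creates throwaway Pub/Sub resources for each test invocation. A sketch of that kind of setup with a recent google-cloud-pubsub client; the input topic name and the exact fixture flow are assumptions about the harness, not lifted from its source:

    import uuid
    from google.cloud import pubsub_v1

    def make_pubsub_fixtures(project_id):
        suffix = str(uuid.uuid4())
        publisher = pubsub_v1.PublisherClient()
        subscriber = pubsub_v1.SubscriberClient()

        # Output topic and input subscription names match the pattern in the
        # job graph; the input topic name is assumed.
        input_topic = publisher.topic_path(
            project_id, 'exercise_streaming_metrics_topic_input' + suffix)
        input_sub = subscriber.subscription_path(
            project_id, 'exercise_streaming_metrics_subscription_input' + suffix)
        output_topic = publisher.topic_path(
            project_id, 'exercise_streaming_metrics_topic_output' + suffix)

        publisher.create_topic(name=input_topic)
        publisher.create_topic(name=output_topic)
        subscriber.create_subscription(name=input_sub, topic=input_topic)
        return input_sub, output_topic
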
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T22:43:48.924960Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_15_43_47-11199889323274627752'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409224333-379233'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T22:43:48.924960Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_15_43_47-11199889323274627752]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-11199889323274627752?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_15_43_47-11199889323274627752 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:47.991Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:47.991Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_15_43_47-11199889323274627752.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:47.991Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_15_43_47-11199889323274627752. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:51.360Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:52.340Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:52.823Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:52.868Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:52.940Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:52.985Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.027Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.063Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.089Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.203Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.246Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.286Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.332Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.376Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.407Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:43:53.446Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:44:00.825Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:44:00.850Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:44:00.877Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:44:26.302Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T22:44:32.693Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_15_43_47-11199889323274627752 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2347.828s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_46-1717984077909731928?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_54_24-16081463988881216360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_04_06-4127297978038311131?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_13_12-7412235919693403980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_45-6208351490489225593?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_54_23-10292071489275150653?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_04_08-547362749310640307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-11199889323274627752?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_52_17-372602812806641181?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_02_10-15227454302817351390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-13086350651831578328?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_54_31-12774120736967692077?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_04_05-14968271532272294175?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-3497471252652374571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_53_29-17786629273064033834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_03_22-3881171522311635383?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_45-16305387961704944311?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_53_35-18324034330423159623?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_02_38-1154144994801061006?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-2334422693061369065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_54_12-9252332532042049499?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_16_04_08-11046329643715779951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_43_47-2132995686046240139?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_15_54_15-10149676693587290823?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 27s
64 actionable tasks: 48 executed, 16 from cache

Publishing build scan...
https://gradle.com/s/gaw6os4nbguwq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #331

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/331/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-9726] [py] Make region optional for non-service Dataflow.

[kcweaver] [BEAM-9726] [java] Make region optional for non-service runner.


------------------------------------------
[...truncated 5.44 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409212529-186567.1586467529.186709/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409212529-186567", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc8b8b6d5-542f-4ad5-9adc-1c2288ccb990"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputc8b8b6d5-542f-4ad5-9adc-1c2288ccb990", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputc8b8b6d5-542f-4ad5-9adc-1c2288ccb990", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
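
For completeness, the settings visible in this graph (streaming job type, staged dataflow-worker.jar, the apache-beam-testing project and us-central1 region) map onto ordinary pipeline options. A hedged sketch of how such a run is typically launched; the flag values echo the logs, TestDataflowRunner is the usual runner for ValidatesRunner suites, and the temp_location path is an assumed sibling of the staging-it prefix seen above:

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--staging_location=gs://temp-storage-for-end-to-end-tests/staging-it',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',  # assumed
        '--streaming',  # yields "type": "JOB_TYPE_STREAMING" in the graph
    ])
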
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T21:25:44.457214Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_14_25_43-1310891188597603812'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409212529-186567'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T21:25:44.457214Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_14_25_43-1310891188597603812]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_43-1310891188597603812?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_14_25_43-1310891188597603812 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:43.459Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:43.459Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_14_25_43-1310891188597603812. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:43.460Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_14_25_43-1310891188597603812.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:47.066Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.017Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.512Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.547Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.626Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.664Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.699Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.738Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.771Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.830Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.865Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.901Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.951Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:48.990Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:49.026Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:49.063Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:55.619Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:55.656Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:25:55.697Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:26:15.565Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T21:26:21.777Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_14_25_43-1310891188597603812 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2239.405s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_41-4860840910161388180?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_35_42-7279367778698680400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_45_24-3517752623055706797?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_54_10-17717941981236239916?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_41-11731955540864783440?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_34_19-8054220791863631332?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_43_29-3738271119320263336?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_43-1310891188597603812?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_34_02-10601750353246692197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_43_54-5559071878329562848?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_43-550086456549321890?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_34_07-1023397386615206020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_45_07-8383889730056634579?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_40-12606453439727446975?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_35_32-8079545770085583307?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_44_59-13183101934772512393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_42-15348842031255529986?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_35_39-9326366050559589910?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_43-16194854816125733681?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_35_12-15553851880971545023?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_45_04-5943981756030336504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_25_42-3782419601956926659?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_35_40-16188946817498083766?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_14_45_02-14930714693017866177?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 15m 54s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/gann5cnfmyd2c

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #330

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/330/display/redirect?page=changes>

Changes:

[github] [BEAM-9731] Include more detail in passert.Equals errors. (#11359)

[github] [BEAM-9085] Fix performance regression in SyntheticSource on Python 3


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409192950-450474.1586460590.450607/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409192950-450474", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input721f36ca-00af-4433-8aae-8fb8da9d86f5"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input721f36ca-00af-4433-8aae-8fb8da9d86f5", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output721f36ca-00af-4433-8aae-8fb8da9d86f5", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
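
The JSON above describes a three-step streaming graph: a ParallelRead from a Pub/Sub subscription (s1), the generate_metrics ParDo (s2), and a ParallelWrite back to a Pub/Sub topic (s3). Reconstructed as Beam Python it is roughly the sketch below; the subscription, topic, and DoFn path are copied from the JSON, while the pipeline options are a minimal assumption:

    # Rough reconstruction of the submitted job graph (a sketch, not the
    # test's actual source). Names in quotes match the user_name fields above.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions
    from apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline import (
        StreamingUserMetricsDoFn)

    options = PipelineOptions(['--streaming'])  # assumed; the job is JOB_TYPE_STREAMING
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/'
                 'exercise_streaming_metrics_subscription_input721f36ca-00af-4433-8aae-8fb8da9d86f5')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 'projects/apache-beam-testing/topics/'
                 'exercise_streaming_metrics_topic_output721f36ca-00af-4433-8aae-8fb8da9d86f5'))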
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T19:30:04.419732Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_12_30_03-13033963412867048034'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409192950-450474'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T19:30:04.419732Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_12_30_03-13033963412867048034]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_03-13033963412867048034?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_12_30_03-13033963412867048034 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:03.332Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_12_30_03-13033963412867048034.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:03.332Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_12_30_03-13033963412867048034. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:03.332Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:16.393Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.052Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.645Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.671Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.757Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.793Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.831Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.868Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.901Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.960Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:17.998Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:18.028Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:18.072Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:18.109Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:18.143Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:18.178Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:24.521Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:24.558Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:24.595Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:27.882Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T19:30:54.296Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_12_30_03-13033963412867048034 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
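
The "Timing out on waiting for job ... after 60 seconds" WARNING above is the test harness ending its bounded wait on a job that, being streaming, never reaches a terminal state on its own. The Beam Python pattern behind it, as a sketch (wait_until_finish takes milliseconds; the explicit cancel is an assumption about cleanup, not read from the log):

    # Sketch: bounded wait on a streaming job, then cancel if still running.
    from apache_beam.runners.runner import PipelineState

    result = pipeline.run()  # pipeline: a streaming beam.Pipeline, assumed in scope
    result.wait_until_finish(duration=60 * 1000)  # duration is in milliseconds
    if result.state not in (PipelineState.DONE, PipelineState.CANCELLED):
        result.cancel()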

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2302.131s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_03-7794741100411741549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_38_57-10996339017041456583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_49_16-5070461858023502622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_59_18-8139880604663160904?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_03-13033963412867048034?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_38_42-1664126451644899082?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_48_05-2548507957498368098?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_02-11465908331478392296?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_39_49-8935676928318461279?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_03-15399189914440096308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_40_00-14213286203462245813?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_49_08-16465982090758623744?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_04-4061483706117095858?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_39_18-2368705937818489052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_48_26-1306177523069946856?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_01-11985034181799542549?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_39_48-13466508315297872865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_49_06-3150335687306656814?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_04-18432895079215653395?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_39_22-1187996827631752935?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_48_24-14169305015335527865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_30_03-14947897465612959945?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_39_01-7373758993517080937?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_12_48_03-14768345619271642577?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 17m 42s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/yigitapvmw4qi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #329

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/329/display/redirect?page=changes>

Changes:

[github] [BEAM-8280] Document Python 3 annotations support (#11232)


------------------------------------------
[...truncated 5.43 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409175056-299017.1586454656.299149/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409175056-299017", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputabe16b1c-9f27-4bcf-a052-0c448352e8b8"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_inputabe16b1c-9f27-4bcf-a052-0c448352e8b8", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_outputabe16b1c-9f27-4bcf-a052-0c448352e8b8", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T17:51:14.711635Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_10_51_10-4497026831609467329'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409175056-299017'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T17:51:14.711635Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_10_51_10-4497026831609467329]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_10-4497026831609467329?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_10_51_10-4497026831609467329 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:51:10.778Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_10_51_10-4497026831609467329.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:51:10.778Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:51:10.778Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_10_51_10-4497026831609467329. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:51:58.967Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:51:59.656Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-c.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.190Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.229Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.327Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.371Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.409Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.444Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.468Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.536Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.576Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.615Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.658Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.693Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.727Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:00.765Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:09.558Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:09.586Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-c...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:09.630Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:23.839Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T17:52:38.044Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_10_51_10-4497026831609467329 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
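
The autoscaling WARNING above notes that Streaming Engine jobs scale between 1 and 100 workers unless maxNumWorkers is specified. On the Python SDK side that cap is the max_num_workers pipeline option; a sketch of pinning it, with standard Dataflow option names and illustrative values:

    # Sketch: capping autoscaling for a streaming Dataflow job.
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--streaming',
        '--max_num_workers=3',  # illustrative; unset, autoscaling may reach 100
    ])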

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2308.016s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_10-4081771133930036954?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_01_16-5887860900750487173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_10_46-13071923283833447873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_20_15-2148196705153950146?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_10-4497026831609467329?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_00_36-1760171106578379219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_10_41-12982269168285413398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_10-7541065308201026888?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_00_26-7178466835478631792?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_08_54-8626498553355039002?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_11-13293493323451157157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_00_03-12462565361158397501?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_10_22-2446679655928032191?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_07-5773893773607104306?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_59_23-14061460349407072955?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_09_12-13285392356884421426?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_11-14422425022405653094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_00_09-1915759263296710697?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_09_17-13292415936324402226?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_12-10058792582542787571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_02_12-14471629787753400706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_10_51_09-880040959850680004?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_00_19-12567952602410241635?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_11_09_13-2789751859446851237?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 16s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/x677x2vtvvx7a

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #328

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/328/display/redirect>

Changes:


------------------------------------------
[...truncated 5.42 MB...]
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0409134759-080483.1586440079.080621/dataflow-worker.jar", 
            "name": "dataflow-worker.jar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python-fnapi:beam-master-20200317"
      }
    ]
  }, 
  "name": "beamapp-jenkins-0409134759-080483", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "_PubSubSource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.pubsub._PubSubSource"
          }, 
          {
            "key": "with_attributes", 
            "label": "With Attributes", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "BOOLEAN", 
            "value": false
          }, 
          {
            "key": "subscription", 
            "label": "Pubsub Subscription", 
            "namespace": "apache_beam.io.gcp.pubsub._PubSubSource", 
            "type": "STRING", 
            "value": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0642ab6d-2132-44e4-8785-081218e935d4"
          }
        ], 
        "format": "pubsub", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Read.out"
          }
        ], 
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input0642ab6d-2132-44e4-8785-081218e935d4", 
        "user_name": "ReadFromPubSub/Read"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s2", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "StreamingUserMetricsDoFn", 
            "type": "STRING", 
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "None", 
            "user_name": "generate_metrics.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4", 
        "user_name": "generate_metrics"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s3", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "None", 
          "step_name": "s2"
        }, 
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output0642ab6d-2132-44e4-8785-081218e935d4", 
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2020-04-09T13:48:17.251323Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2020-04-09_06_48_15-2580013217842299967'
 location: u'us-central1'
 name: u'beamapp-jenkins-0409134759-080483'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-04-09T13:48:17.251323Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-04-09_06_48_15-2580013217842299967]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_15-2580013217842299967?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-04-09_06_48_15-2580013217842299967 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:15.802Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-09_06_48_15-2580013217842299967. The number of workers will be between 1 and 100.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:15.802Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-09_06_48_15-2580013217842299967.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:15.802Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:20.465Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.193Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.759Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.796Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.859Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.897Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:21.926Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.015Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.047Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.085Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.117Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.152Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.188Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.221Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.254Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:22.280Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:40.094Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:40.130Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:40.164Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:48:46.728Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-04-09T13:49:06.709Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-04-09_06_48_15-2580013217842299967 after 60 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
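
The google.auth and urllib3 DEBUG lines above show Application Default Credentials resolving on a Compute Engine worker: the client probes the metadata server (169.254.169.254, then metadata.google.internal) for the project id and a token for the default service account. The explicit equivalent is a single call; a sketch, with the scope chosen as a common assumption:

    # Sketch: the credential lookup behind the DEBUG lines above. On GCE
    # this queries the metadata server for the default service account.
    import google.auth

    credentials, project_id = google.auth.default(
        scopes=['https://www.googleapis.com/auth/cloud-platform'])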

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2264.017s

FAILED (failures=1)
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_15-13932957090513031637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_57_19-13324588744991761817?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_12-8730444959265243094?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_15_59-15447213181556426691?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_13-8735284956541372405?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_57_56-17681920502983389612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_10-17488583631803634173?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_15-2580013217842299967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_56_39-17935382327430281212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_05_02-18113661036341445274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_15-6395434526839020113?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_57_00-10314004466684313952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_04-16761688181949715723?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_13-5412550135627266338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_58_11-9833505428755523934?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_04-10527459880605809806?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_15-10191990575837799846?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_57_53-16069259144248122095?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_16-11020651497316759246?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_56_55-11868263397333012783?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_10-381163482683084629?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_48_12-1362888733506617121?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_06_57_18-14227819897287878646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-09_07_07_06-11849911613818615914?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle'> line: 142

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 16m 45s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/lyc7yzdj5b5eq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org