Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/10/03 06:36:04 UTC

Build failed in Jenkins: beam_PreCommit_Python_Cron #3319

See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/3319/display/redirect>

Changes:


------------------------------------------
[...truncated 1.44 MB...]
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "encode.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s8"
        },
        "serialized_fn": "ref_AppliedPTransform_encode_11",
        "user_name": "encode"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s10",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s9"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output5d81a7d5-a26d-45a9-9ca4-ae880db7c741",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
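
For reference, the truncated JSON above is the Dataflow job description the Python SDK's apiclient builds from the pipeline graph; steps s9 and s10 are the trailing "encode" ParDo and the Pub/Sub write of the streaming wordcount example. A minimal sketch of a pipeline that would yield steps with these names follows; the tokenizer, window size, and topic paths are assumptions for illustration, not the test's actual arguments.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.transforms import window

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True  # streaming job, per JOB_TYPE_STREAMING

    with beam.Pipeline(options=options) as p:
        counts = (
            p
            | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                topic='projects/<project>/topics/<input>')        # placeholder topic
            | 'decode' >> beam.Map(lambda raw: raw.decode('utf-8'))
            | 'split' >> beam.FlatMap(lambda line: line.split())  # assumed tokenizer
            | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
            | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))  # assumed window size
            | 'group' >> beam.GroupByKey()
            | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1]))))
        (counts
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
         | 'WriteToPubSub' >> beam.io.WriteToPubSub(
             topic='projects/<project>/topics/<output>'))         # placeholder topic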
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-10-03T06:27:53.891928Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-10-02_23_27_52-9165277199966145147'
 location: 'us-central1'
 name: 'beamapp-jenkins-1003062745-647471'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-10-03T06:27:53.891928Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-10-02_23_27_52-9165277199966145147]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-10-02_23_27_52-9165277199966145147
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_23_27_52-9165277199966145147?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-10-02_23_27_28-2937938311624882132 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:28.664Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-10-02_23_27_28-2937938311624882132. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:28.664Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-10-02_23_27_28-2937938311624882132.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:28.664Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:33.565Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.390Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.419Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.509Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.546Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.578Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.629Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.692Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.765Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.795Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.833Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.873Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.904Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.938Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:34.978Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.006Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.041Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.075Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.107Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.147Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.186Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.221Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.244Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:35.278Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:36.389Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:36.420Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:36.458Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:45.551Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:58.772Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
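
The 100-descriptor message above refers to Cloud Monitoring's per-project limit on custom metric descriptors. If stale descriptors need pruning, a sketch along these lines with the google-cloud-monitoring client would list (and optionally delete) the custom.googleapis.com/* descriptors; the project id is taken from the log, but the request-dict call shown is the 2.x client library's form and may differ in other versions.

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'  # project id from the log above

    # List only the custom metric descriptors counted against the limit.
    descriptors = client.list_metric_descriptors(request={
        'name': project_name,
        'filter': 'metric.type = starts_with("custom.googleapis.com/")',
    })
    for descriptor in descriptors:
        print(descriptor.type)
        # To delete an unused descriptor, uncomment:
        # client.delete_metric_descriptor(
        #     name='%s/metricDescriptors/%s' % (project_name, descriptor.type))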

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-10-02_23_27_52-9165277199966145147 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:52.696Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-10-02_23_27_52-9165277199966145147.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:52.696Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-10-02_23_27_52-9165277199966145147. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:52.696Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:58.450Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.272Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.305Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.384Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.426Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.473Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.516Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.601Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.664Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.780Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.825Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.855Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.885Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.912Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.944Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:27:59.980Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.010Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.044Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.079Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.114Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into encode
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.155Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.214Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.245Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:00.286Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:01.418Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:01.457Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:01.491Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:26.597Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:34.463Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:32.484Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:28:32.517Z: JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:29:06.444Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-03T06:29:06.496Z: JOB_MESSAGE_DETAILED: Workers have started successfully.

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2020-10-02_23_27_28-2937938311624882132 after 360 seconds
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
WARNING:apache_beam.runners.dataflow.dataflow_runner:Timing out on waiting for job 2020-10-02_23_27_52-9165277199966145147 after 365 seconds
DEBUG:google.auth._default:Checking None for explicit credentials as part of auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth process...
DEBUG:google.auth._default:No App Engine library was found so cannot authentication via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 221
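
The "Timing out on waiting for job ... after N seconds" warnings above are expected for these streaming integration tests: the test runner bounds wait_until_finish with a duration and cancels the job once the pipeline verifiers pass, so the timeout alone does not fail the test (both suites report "ok" below). A rough sketch of that pattern, assuming `pipeline` is an already-constructed apache_beam.Pipeline and with an illustrative timeout:

    from apache_beam.runners.runner import PipelineState

    # Run a streaming pipeline, wait a bounded time, then cancel if still running.
    result = pipeline.run()
    result.wait_until_finish(duration=360 * 1000)  # duration is in milliseconds
    if result.state not in (PipelineState.DONE, PipelineState.FAILED,
                            PipelineState.CANCELLED):
        result.cancel()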

> Task :sdks:python:test-suites:dataflow:py36:preCommitIT_streaming_V2
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_23_27_28-2937938311624882132?project=apache-beam-testing
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 490.251s

OK

> Task :sdks:python:test-suites:dataflow:py37:preCommitIT_streaming_V2
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-02_23_27_52-9165277199966145147?project=apache-beam-testing
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-preCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 498.792s

OK

> Task :sdks:python:test-suites:dataflow:preCommitIT_V2

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:tox:py36:testPy36Cloud'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35m 33s
82 actionable tasks: 60 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/tsdkd3j6qkhs2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PreCommit_Python_Cron #3320

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/3320/display/redirect>

