Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/12/16 01:24:30 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #3296

See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3296/display/redirect?page=changes>

Changes:

[samuelw] [BEAM-11401][BEAM-11366] Change ReaderCache invalidations to be

[Boyuan Zhang] Enable more tests on Java + Python FnRunner

[zyichi] Bump up python container versions

[noreply] Do not add unnecessary experiment use_multiple_sdk_containers. (#13475)

[zyichi] Skip dynamic timer test in portable spark test

[noreply] [BEAM-11360] Updates Dataflow Python multi-language pipelines to use

[noreply] Revert "Do not add unnecessary experiment use_multiple_sdk_containers."


------------------------------------------
[...truncated 40.19 MB...]
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_ReadFromPubSub/Map(_from_proto_str)_4",
        "user_name": "ReadFromPubSub/Map(_from_proto_str)"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s3",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "add_attribute"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "add_attribute.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s2"
        },
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5",
        "user_name": "add_attribute"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s4",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "message_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s3"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s5",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s4"
        },
        "pubsub_id_label": "id",
        "pubsub_serialized_attributes_fn": "",
        "pubsub_timestamp_label": "timestamp",
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output77b73f3c-506c-4762-ae7e-161ea55a39f4",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
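
For orientation, the job graph above (steps s1-s5) describes a streaming pipeline of the shape sketched below: read messages from a Pub/Sub subscription, tag each one with an attribute, and write them back to a Pub/Sub topic. This is a minimal reconstruction from the step names (ReadFromPubSub/Map(_from_proto_str), add_attribute, WriteToPubSub/ToProtobuf) and the pubsub_id_label/pubsub_timestamp_label settings in the graph; the subscription, topic, and attribute key/value below are illustrative placeholders, not the test's actual fixtures.

import apache_beam as beam
from apache_beam.io.gcp.pubsub import PubsubMessage
from apache_beam.options.pipeline_options import PipelineOptions

INPUT_SUB = 'projects/<project>/subscriptions/<input-sub>'   # placeholder
OUTPUT_TOPIC = 'projects/<project>/topics/<output-topic>'    # placeholder

def add_attribute(msg):
    # Mirrors the 'add_attribute' step: return a copy of the message
    # with one extra attribute (the key/value here are assumptions).
    attrs = dict(msg.attributes or {})
    attrs['processed'] = 'true'
    return PubsubMessage(msg.data, attrs)

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (p
     | beam.io.ReadFromPubSub(
           subscription=INPUT_SUB,
           with_attributes=True,
           id_label='id',                    # matches pubsub_id_label above
           timestamp_attribute='timestamp')  # matches pubsub_timestamp_label
     | 'add_attribute' >> beam.Map(add_attribute)
     | beam.io.WriteToPubSub(OUTPUT_TOPIC, with_attributes=True))
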
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-12-16T01:05:30.196453Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-12-15_17_05_28-13406386751005037426'
 location: 'us-central1'
 name: 'beamapp-jenkins-1216010519-858911'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-12-16T01:05:30.196453Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-12-15_17_05_28-13406386751005037426]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2020-12-15_17_05_28-13406386751005037426
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-12-15_17_05_28-13406386751005037426?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-12-15_17_05_28-13406386751005037426 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.101Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.819Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.822Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.831Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.841Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.843Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.846Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.857Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.859Z: JOB_MESSAGE_DETAILED: Fusing consumer ReadFromPubSub/Map(_from_proto_str) into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.861Z: JOB_MESSAGE_DETAILED: Fusing consumer add_attribute into ReadFromPubSub/Map(_from_proto_str)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.864Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into add_attribute
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.866Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.874Z: JOB_MESSAGE_BASIC: The pubsub read for: projects/apache-beam-testing/subscriptions/psit_subscription_input77b73f3c-506c-4762-ae7e-161ea55a39f4 is configured to compute input data watermarks based on custom timestamp attribute timestamp. Cloud Dataflow has created an additional tracking subscription to do this, which will be cleaned up automatically. For details, see: https://cloud.google.com/dataflow/model/pubsub-io#timestamps-ids
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.877Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.928Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:33.978Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:34.182Z: JOB_MESSAGE_DEBUG: Executing wait step start2
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:34.195Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:34.200Z: JOB_MESSAGE_BASIC: Starting 1 workers...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:35.395Z: JOB_MESSAGE_DETAILED: Pub/Sub resources set up for topic 'projects/apache-beam-testing/topics/psit_topic_input77b73f3c-506c-4762-ae7e-161ea55a39f4'.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:37.581Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+ReadFromPubSub/Map(_from_proto_str)+add_attribute+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:05:50.405Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:06:09.189Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:06:09.661Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-12-16T01:06:36.290Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
oauth2client.transport: INFO: Refreshing due to a 401 (attempt 1/2)
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2020-12-15_17_05_28-13406386751005037426 after 183 seconds
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 241
apache_beam.io.gcp.tests.pubsub_matcher: ERROR: Timeout after 300 sec. Received 0 messages from projects/apache-beam-testing/subscriptions/psit_subscription_output77b73f3c-506c-4762-ae7e-161ea55a39f4.
--------------------- >> end captured logging << ---------------------
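
The ERROR above is the actual failure: the test's verification step polls the output subscription until the expected messages arrive or a deadline (300 s here) passes, and in this run nothing arrived. Roughly, such a polling loop looks like the sketch below, written against the google-cloud-pubsub client; the function name and parameters are illustrative, not the pubsub_matcher module's actual API.

import time

from google.api_core import exceptions
from google.cloud import pubsub_v1

def wait_for_messages(sub_path, expected_count, timeout_sec=300):
    # Pull from the output subscription until enough messages arrive
    # or the deadline passes; the caller asserts on the result.
    subscriber = pubsub_v1.SubscriberClient()
    received = []
    deadline = time.time() + timeout_sec
    while time.time() < deadline and len(received) < expected_count:
        try:
            resp = subscriber.pull(
                request={'subscription': sub_path, 'max_messages': 10},
                timeout=10)
        except exceptions.DeadlineExceeded:
            continue  # nothing available yet; keep polling
        ack_ids = [rm.ack_id for rm in resp.received_messages]
        received.extend(rm.message for rm in resp.received_messages)
        if ack_ids:
            subscriber.acknowledge(
                request={'subscription': sub_path, 'ack_ids': ack_ids})
    return received

A caller would then assert, for example, that len(wait_for_messages(output_sub, 1)) == 1; receiving zero messages within the deadline produces exactly the "Timeout after 300 sec. Received 0 messages" error logged above.
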

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 68 tests in 4463.758s

FAILED (SKIP=7, failures=2)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 118

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.7/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 24m 2s
208 actionable tasks: 151 executed, 53 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches limit is too low.

Publishing build scan...
https://gradle.com/s/txby72di7iwdi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #3297

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/3297/display/redirect?page=changes>

