Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/07/19 20:00:31 UTC

beam_PostCommit_Python36 - Build # 4130 - Aborted!

Check console output at https://ci-beam.apache.org/job/beam_PostCommit_Python36/4130/ to view the results.

Jenkins build is back to normal : beam_PostCommit_Python36 #4133

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4133/display/redirect>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #4132

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4132/display/redirect?page=changes>

Changes:

[noreply] Fix spelling.


------------------------------------------
[...truncated 47.68 MB...]
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "format.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s7"
        },
        "serialized_fn": "ref_AppliedPTransform_format_10",
        "user_name": "format"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s9",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "encode.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s8"
        },
        "serialized_fn": "ref_AppliedPTransform_encode_11",
        "user_name": "encode"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s10",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "bytes_to_proto_str"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:interval_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "None",
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s9"
        },
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub-ToProtobuf_13",
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s11",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "None",
          "step_name": "s10"
        },
        "pubsub_serialized_attributes_fn": "",
        "pubsub_topic": "projects/apache-beam-testing/topics/wc_topic_output87f228c6-76e7-4565-8fe6-c631fa9318ac",
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
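
For reference, the step names and user_name fields in the job graph above (and in the fusion messages that follow) line up with the standard Beam streaming wordcount example. Below is a minimal sketch of that pipeline, assuming placeholder Pub/Sub topics; it is illustrative, not the exact test code:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

# Placeholder topics; the test uses generated wc_topic_* topics in
# the apache-beam-testing project.
INPUT_TOPIC = 'projects/<project>/topics/<input_topic>'
OUTPUT_TOPIC = 'projects/<project>/topics/<output_topic>'

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
  (p
   | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(topic=INPUT_TOPIC)
   | 'decode' >> beam.Map(lambda raw: raw.decode('utf-8'))
   | 'split' >> beam.FlatMap(lambda line: line.split())
   | 'pair_with_one' >> beam.Map(lambda word: (word, 1))
   | 'WindowInto' >> beam.WindowInto(window.FixedWindows(15))
   | 'group' >> beam.GroupByKey()
   | 'count' >> beam.MapTuple(lambda word, ones: (word, sum(ones)))
   | 'format' >> beam.MapTuple(lambda word, count: '%s: %d' % (word, count))
   | 'encode' >> beam.Map(lambda text: text.encode('utf-8'))
   | 'WriteToPubSub' >> beam.io.WriteToPubSub(topic=OUTPUT_TOPIC))
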
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2021-07-20T06:17:37.781494Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2021-07-19_23_17_37-17732714687950199656'
 location: 'us-central1'
 name: 'beamapp-jenkins-0720061723-730941'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2021-07-20T06:17:37.781494Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2021-07-19_23_17_37-17732714687950199656]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job: 2021-07-19_23_17_37-17732714687950199656
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-07-19_23_17_37-17732714687950199656?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2021-07-19_23_17_37-17732714687950199656 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:42.412Z: JOB_MESSAGE_BASIC: Worker configuration: e2-standard-4 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.163Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.196Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.268Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.327Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.363Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.430Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.473Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.503Z: JOB_MESSAGE_DETAILED: Fusing consumer decode into ReadFromPubSub/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.534Z: JOB_MESSAGE_DETAILED: Fusing consumer split into decode
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.577Z: JOB_MESSAGE_DETAILED: Fusing consumer pair_with_one into split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.613Z: JOB_MESSAGE_DETAILED: Fusing consumer WindowInto(WindowIntoFn) into pair_with_one
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.635Z: JOB_MESSAGE_DETAILED: Fusing consumer group/WriteStream into WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.667Z: JOB_MESSAGE_DETAILED: Fusing consumer group/MergeBuckets into group/ReadStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.703Z: JOB_MESSAGE_DETAILED: Fusing consumer count into group/MergeBuckets
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.730Z: JOB_MESSAGE_DETAILED: Fusing consumer format into count
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.752Z: JOB_MESSAGE_DETAILED: Fusing consumer encode into format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.808Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/ToProtobuf into encode
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.833Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToPubSub/Write/NativeWrite into WriteToPubSub/ToProtobuf
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.874Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.903Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.938Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:43.974Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:44.180Z: JOB_MESSAGE_DEBUG: Executing wait step start23
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:44.248Z: JOB_MESSAGE_BASIC: Executing operation group/ReadStream+group/MergeBuckets+count+format+encode+WriteToPubSub/ToProtobuf+WriteToPubSub/Write/NativeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:44.284Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+decode+split+pair_with_one+WindowInto(WindowIntoFn)+group/WriteStream
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:44.296Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:17:44.319Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:18:16.642Z: JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
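
The message above is advisory rather than fatal: once a project accumulates 100 Dataflow-created metric descriptors, stale ones can be listed and deleted through the Cloud Monitoring API. A hedged sketch, assuming the google-cloud-monitoring client library (not part of this job) and a placeholder project:

from google.cloud import monitoring_v3

client = monitoring_v3.MetricServiceClient()
project_name = 'projects/<your-project>'  # placeholder

for descriptor in client.list_metric_descriptors(name=project_name):
  # Dataflow-created user metrics live under custom.googleapis.com/*.
  if descriptor.type.startswith('custom.googleapis.com/'):
    print('candidate for deletion:', descriptor.name)
    # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to delete
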
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:18:33.270Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:19:03.825Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-07-20T06:19:03.877Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: WARNING: Timing out on waiting for job 2021-07-19_23_17_37-17732714687950199656 after 361 seconds
google.auth._default: DEBUG: Checking None for explicit credentials as part of auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth process...
google.auth._default: DEBUG: No App Engine library was found so cannot authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token?scopes=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform%2Chttps%3A%2F%2Fwww.googleapis.com%2Fauth%2Fpubsub HTTP/1.1" 200 244
--------------------- >> end captured logging << ---------------------
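
The "Timing out on waiting for job ... after 361 seconds" warning in the captured logging above is how the streaming integration test bounds an otherwise unbounded job: it waits for a fixed duration and then cancels. A hedged sketch of that pattern against the Dataflow runner (the duration and surrounding setup are illustrative):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

pipeline = beam.Pipeline(options=PipelineOptions(streaming=True))
# ... build the pipeline ...
result = pipeline.run()
# wait_until_finish takes the duration in milliseconds.
result.wait_until_finish(duration=361 * 1000)
if not result.is_in_terminal_state():
  result.cancel()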

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 74 tests in 5551.133s

FAILED (SKIP=8, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>' line: 126

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 36m 59s
216 actionable tasks: 153 executed, 59 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ml2ukh2ieju5q

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure



Build failed in Jenkins: beam_PostCommit_Python36 #4131

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Python36/4131/display/redirect?page=changes>

Changes:

[noreply] [BEAM-12548] Implement EqualsFloat test helper (#15175)

[noreply] Fix formatting using go fmt (#15189)

[noreply] [BEAM-4152] Add merging strategy for sessions to WindowingStrategy proto

[noreply] [BEAM-12613] Enable Python build tests for Samza (#15169)


------------------------------------------
[...truncated 9.44 MB...]

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 704066753
}
message: "Pipeline_options: {\'experiments\': [\'beam_fn_api\'], \'setup_file\': \'./setup.py\', \'sdk_location\': \'container\', \'job_endpoint\': \'embed\', \'sdk_worker_parallelism\': \'1\', \'environment_cache_millis\': \'0\'}"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:100"
thread: "MainThread"
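
The options dict logged above maps one-to-one onto SDK pipeline option flags. A minimal sketch of constructing the same configuration, assuming nothing beyond the flags shown in the log:

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--experiments=beam_fn_api',
    '--setup_file=./setup.py',
    '--sdk_location=container',
    '--job_endpoint=embed',
    '--sdk_worker_parallelism=1',
    '--environment_cache_millis=0',
])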

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 709595680
}
message: "Creating state cache with size 0"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/statecache.py:172"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 710243225
}
message: "Creating insecure control channel for localhost:43815."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:181"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 715689659
}
message: "Control channel established."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:189"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 716498851
}
message: "Initializing SDKHarness with unbounded number of workers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:232"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 718195438
}
message: "Python sdk harness starting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:145"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 724989175
}
message: "Creating insecure state channel for localhost:38463."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:878"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 725398778
}
message: "State channel established."
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:885"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 727717399
}
message: "Creating client data channel for localhost:34283"
instruction_id: "bundle_1"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:685"
thread: "Thread-14"

INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((ref_AppliedPTransform_add-points-Impulse_3)+(ref_AppliedPTransform_add-points-FlatMap-lambda-at-core-py-2968-_4))+(ref_AppliedPTransform_add-points-Map-decode-_6))+(ref_AppliedPTransform_Map-get_julia_set_point_color-_7))+(ref_AppliedPTransform_x-coord-key_8))+(x coord/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (((((x coord/Read)+(ref_AppliedPTransform_format_10))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WindowInto-WindowIntoFn-_20))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-WriteBundles_21))+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Pair_22))+(WriteToText/Write/WriteImpl/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((WriteToText/Write/WriteImpl/GroupByKey/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-Extract_24))+(ref_PCollection_PCollection_16/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running ((ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-PreFinalize_25))+(ref_PCollection_PCollection_17/Write)
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running (ref_PCollection_PCollection_10/Read)+(ref_AppliedPTransform_WriteToText-Write-WriteImpl-FinalizeWrite_26)
INFO:root:severity: INFO
timestamp {
  seconds: 1626740661
  nanos: 939976453
}
message: "Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1"
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:303"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 49585819
}
message: "Renamed 1 shards in 0.11 seconds."
instruction_id: "bundle_6"
transform_id: "WriteToText/Write/WriteImpl/FinalizeWrite"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/io/filebasedsink.py:348"
thread: "Thread-14"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 57771205
}
message: "No more requests from control plane"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:261"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 57983398
}
message: "SDK Harness waiting for in-flight requests to complete"
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:262"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 58073997
}
message: "Closing all cached grpc data channels."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/data_plane.py:717"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 58164119
}
message: "Closing all cached gRPC state handlers."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:897"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 59482336
}
message: "Done consuming work."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker.py:274"
thread: "MainThread"

INFO:root:severity: INFO
timestamp {
  seconds: 1626740662
  nanos: 59805154
}
message: "Python sdk harness exiting."
log_location: "/usr/local/lib/python3.6/site-packages/apache_beam/runners/worker/sdk_worker_main.py:147"
thread: "MainThread"

INFO:apache_beam.runners.portability.local_job_service:Successfully completed job in 9.080270767211914 seconds.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch
INFO:apache_beam.runners.worker.worker_pool_main:Listening for workers at localhost:36623
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO:root:Default Python SDK image for environment is apache/beam_python3.6_sdk:2.32.0.dev
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function pack_combiners at 0x7ff483ee62f0> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function lift_combiners at 0x7ff483ee6378> ====================
INFO:apache_beam.runners.portability.fn_api_runner.translations:==================== <function sort_stages at 0x7ff483ee6a60> ====================
Traceback (most recent call last):
  File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 94, in <module>
    run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/examples/wordcount.py>", line 89, in run
    output | 'Write' >> WriteToText(known_args.output)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 585, in __exit__
    self.result = self.run()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/pipeline.py>", line 564, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 47, in run_pipeline
    return super(SparkRunner, self).run_pipeline(pipeline, options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 438, in run_pipeline
    job_service_handle = self.create_job_service(options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/portable_runner.py>", line 317, in create_job_service
    return self.create_job_service_handle(server.start(), options)
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 81, in start
    self._endpoint = self._job_server.start()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 106, in start
    cmd, endpoint = self.subprocess_cmd_and_endpoint()
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/job_server.py>", line 150, in subprocess_cmd_and_endpoint
    jar_path = self.local_jar(self.path_to_jar())
  File "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/portability/spark_runner.py>", line 92, in path_to_jar
    self._jar)
ValueError: Unable to parse jar URL "<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/runners/spark/2/job-server/build/libs/beam-runners-spark-job-server-2.32.0-SNAPSHOT.jar>". If using a full URL, make sure the scheme is specified. If using a local file path, make sure the file exists; you may have to first build the job server using `./gradlew runners:spark:2:job-server:shadowJar`.
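
As the error suggests, the fix is to build the Spark job-server jar once with `./gradlew runners:spark:2:job-server:shadowJar` and point the runner at a real file or URL. A hedged sketch of the latter (the jar path below is illustrative and must exist on disk):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=SparkRunner',
    # Path produced by the gradle task above; adjust to your checkout.
    '--spark_job_server_jar=runners/spark/2/job-server/build/libs/'
    'beam-runners-spark-job-server-2.32.0-SNAPSHOT.jar',
])
with beam.Pipeline(options=options) as p:
  p | beam.Create(['hello world']) | beam.Map(print)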

> Task :sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:java-fn-execution:compileJava'.
> Failed to load cache entry for task ':runners:java-fn-execution:compileJava'

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:portableWordCountSparkRunnerBatch'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 23m 54s
177 actionable tasks: 138 executed, 35 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/ok7wzsizsicby

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
