Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/11/13 19:44:10 UTC

Build failed in Jenkins: beam_PostCommit_Python36 #985

See <https://builds.apache.org/job/beam_PostCommit_Python36/985/display/redirect?page=changes>

Changes:

[kcweaver] [BEAM-8628] use mock GcsUtil in testDefaultGcpTempLocationDoesNotExist

[mxm] [BEAM-8622] Exclude UsesStrictTimerOrdering PVR tests for Spark

[aaltay] Remove Interactive Test Suite (#10068)


------------------------------------------
[...truncated 615.41 KB...]
            },
            "output_name": "out",
            "user_name": "ReadFromPubSub/Read.out"
          }
        ],
        "pubsub_subscription": "projects/apache-beam-testing/subscriptions/exercise_streaming_metrics_subscription_input007d449e-cbf7-4112-bfa6-26b1c4e53bd4",
        "user_name": "ReadFromPubSub/Read"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s2",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "StreamingUserMetricsDoFn",
            "type": "STRING",
            "value": "apache_beam.runners.dataflow.dataflow_exercise_streaming_metrics_pipeline.StreamingUserMetricsDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "generate_metrics.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s1"
        },
        "serialized_fn": "ref_AppliedPTransform_generate_metrics_4",
        "user_name": "generate_metrics"
      }
    },
    {
      "kind": "ParallelWrite",
      "name": "s3",
      "properties": {
        "display_data": [],
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "kind:bytes"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "format": "pubsub",
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "pubsub_topic": "projects/apache-beam-testing/topics/exercise_streaming_metrics_topic_output007d449e-cbf7-4112-bfa6-26b1c4e53bd4",
        "user_name": "dump_to_pub/Write/NativeWrite"
      }
    }
  ],
  "type": "JOB_TYPE_STREAMING"
}
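
For reference, the job graph above (steps s1-s3) describes a three-step streaming pipeline: read from Pub/Sub, run a metrics-generating DoFn, write back to Pub/Sub. A minimal Python sketch of that shape follows; the DoFn body is an illustrative assumption (the real StreamingUserMetricsDoFn lives in dataflow_exercise_streaming_metrics_pipeline.py), and the '...' suffixes stand in for the run-specific IDs shown above.

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    class StreamingUserMetricsDoFn(beam.DoFn):
        # Stand-in body; the real DoFn updates user-defined metrics per element.
        def process(self, element):
            yield element

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True
    with beam.Pipeline(options=options) as p:
        _ = (p
             | 'ReadFromPubSub' >> beam.io.ReadFromPubSub(
                 subscription='projects/apache-beam-testing/subscriptions/'
                              'exercise_streaming_metrics_subscription_input...')
             | 'generate_metrics' >> beam.ParDo(StreamingUserMetricsDoFn())
             | 'dump_to_pub' >> beam.io.WriteToPubSub(
                 topic='projects/apache-beam-testing/topics/'
                       'exercise_streaming_metrics_topic_output...'))
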
root: INFO: Create job: <Job
 createTime: '2019-11-13T19:28:34.684593Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-11-13_11_28_33-12938919104804468105'
 location: 'us-central1'
 name: 'beamapp-jenkins-1113192821-768840'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-11-13T19:28:34.684593Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2019-11-13_11_28_33-12938919104804468105]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_28_33-12938919104804468105?project=apache-beam-testing
root: INFO: Job 2019-11-13_11_28_33-12938919104804468105 is in state JOB_STATE_RUNNING
root: INFO: 2019-11-13T19:28:36.955Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-11-13T19:28:37.732Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-11-13T19:28:38.288Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-11-13T19:28:38.290Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-11-13T19:28:38.297Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-11-13T19:28:38.304Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
root: INFO: 2019-11-13T19:28:38.306Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
root: INFO: 2019-11-13T19:28:38.308Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-11-13T19:28:38.318Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-11-13T19:28:38.320Z: JOB_MESSAGE_DETAILED: Fusing consumer generate_metrics into ReadFromPubSub/Read
root: INFO: 2019-11-13T19:28:38.321Z: JOB_MESSAGE_DETAILED: Fusing consumer dump_to_pub/Write/NativeWrite into generate_metrics
root: INFO: 2019-11-13T19:28:38.328Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-11-13T19:28:38.353Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-11-13T19:28:38.362Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-11-13T19:28:38.510Z: JOB_MESSAGE_DEBUG: Executing wait step start2
root: INFO: 2019-11-13T19:28:38.521Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-11-13T19:28:38.525Z: JOB_MESSAGE_BASIC: Starting 1 workers...
root: INFO: 2019-11-13T19:28:42.598Z: JOB_MESSAGE_BASIC: Executing operation ReadFromPubSub/Read+generate_metrics+dump_to_pub/Write/NativeWrite
root: INFO: 2019-11-13T19:29:10.154Z: JOB_MESSAGE_DEBUG: Executing input step topology_init_attach_disk_input_step
root: INFO: 2019-11-13T19:29:10.156Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-11-13T19:29:11.155Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-4 in us-central1-f.
root: INFO: 2019-11-13T19:29:13.423Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
root: INFO: 2019-11-13T19:29:25.912Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: WARNING: Timing out on waiting for job 2019-11-13_11_28_33-12938919104804468105 after 61 seconds
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token HTTP/1.1" 200 181
--------------------- >> end captured logging << ---------------------
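
The google.auth and urllib3 debug lines above show application default credentials being resolved against the GCE metadata server. A minimal sketch of what triggers that sequence, assuming the google-auth package:

    import google.auth
    import google.auth.transport.requests

    # On a GCE/Dataflow worker this queries the metadata server
    # (metadata.google.internal) for the default service account.
    credentials, project_id = google.auth.default()

    # Refreshing fetches an access token from the .../token endpoint,
    # as seen in the captured requests above.
    credentials.refresh(google.auth.transport.requests.Request())
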
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_17-15292469279914964806?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_07_44-16117073418458949873?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_15_08-7305916032501682219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_23_32-2244703510762032057?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_31_32-10490277261022924480?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_12-9048107417466294207?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_16_07-167934407156935520?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_25_08-10404226066742233259?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_33_51-7611488923773713373?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_17-11003089132872653595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_04_39-11507996035653824548?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_13_43-335788556707122192?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_12-8102718812912370476?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_11_04-10399788494373760353?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_19_51-5188755613179275122?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_28_33-12938919104804468105?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
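
The FutureWarning lines above come from the experimental fileio match transforms exercised by fileio_test.py. A minimal sketch of that pattern; the file pattern and hash function are illustrative assumptions:

    import hashlib
    import apache_beam as beam
    from apache_beam.io import fileio

    def compute_hash(readable_file):
        # Illustrative checksum over the matched file's contents.
        with readable_file.open() as f:
            return hashlib.sha256(f.read()).hexdigest()

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['gs://my-bucket/path/*'])  # assumed pattern
             | 'MatchAll' >> fileio.MatchAll()         # experimental
             | 'ReadMatches' >> fileio.ReadMatches()   # experimental
             | 'Checksums' >> beam.Map(compute_hash))
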
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_14-8686825487028897754?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_01_25-1256172739939079890?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_09_18-7229614349422990073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_18_05-2574541652985166681?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_26_02-10908160426529042188?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
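
The BigQuerySink deprecation warnings above name the replacement transform. A minimal WriteToBigQuery sketch; the table spec and schema are assumptions:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create([{'name': 'a', 'value': 1}])
             | beam.io.WriteToBigQuery(
                 'my-project:my_dataset.my_table',  # assumed table spec
                 schema='name:STRING,value:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
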
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_13-661073589038289257?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_00_54-4978806373904252314?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_09_36-12204654271561847701?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_18_39-1311968189050367031?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_26_29-3025328507097156170?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_21-5937196045614376133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_01_01-16288234062364555731?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_09_41-6257159476578563386?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_18_00-447324256249789274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_27_23-4638525053108304390?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_35_18-9298745130601685215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_10_52_16-15310016011748644147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_01_49-4959987224178043756?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_11_49-12500635777192581483?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_19_57-16051478430725807075?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_11_29_56-17066806887714024099?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
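
The repeated BeamDeprecationWarning lines flag reads through <pipeline>.options; the supported pattern is to keep a reference to the PipelineOptions used to construct the pipeline and view that object directly. A minimal sketch:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        DebugOptions, GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions()
    p = beam.Pipeline(options=options)

    # Read settings from the options object you already hold,
    # rather than reaching back through p.options (deprecated).
    experiments = options.view_as(DebugOptions).experiments or []
    temp_location = options.view_as(GoogleCloudOptions).temp_location
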

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3133.500s

FAILED (SKIP=6, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 20s
82 actionable tasks: 61 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/t76lgu4cukj7m

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python36 #992

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/992/display/redirect>



beam_PostCommit_Python36 - Build # 991 - Aborted

Posted by Apache Jenkins Server <je...@builds.apache.org>.
The Apache Jenkins build system has built beam_PostCommit_Python36 (build #991)

Status: Aborted

Check console output at https://builds.apache.org/job/beam_PostCommit_Python36/991/ to view the results.

Build failed in Jenkins: beam_PostCommit_Python36 #990

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/990/display/redirect>

Changes:


------------------------------------------
[...truncated 555.33 KB...]
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_5 stored as values in memory (estimated size 22.8 KB, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 9.9 KB, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_5_piece0 in memory on localhost:39845 (size: 9.9 KB, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:29 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 4 (MapPartitionsRDD[33] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:29 INFO TaskSchedulerImpl: Adding task set 4.0 with 4 tasks
19/11/14 01:19:29 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 16, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 1.0 in stage 4.0 (TID 17, localhost, executor driver, partition 1, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 2.0 in stage 4.0 (TID 18, localhost, executor driver, partition 2, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 3.0 in stage 4.0 (TID 19, localhost, executor driver, partition 3, PROCESS_LOCAL, 7662 bytes)
19/11/14 01:19:29 INFO Executor: Running task 0.0 in stage 4.0 (TID 16)
19/11/14 01:19:29 INFO Executor: Running task 2.0 in stage 4.0 (TID 18)
19/11/14 01:19:29 INFO Executor: Running task 1.0 in stage 4.0 (TID 17)
19/11/14 01:19:29 INFO Executor: Running task 3.0 in stage 4.0 (TID 19)
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 4 non-empty blocks including 4 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/14 01:19:29 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_3 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_3 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO Executor: Finished task 3.0 in stage 4.0 (TID 19). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 3.0 in stage 4.0 (TID 19) in 44 ms on localhost (executor driver) (1/4)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_1 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_2 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_1 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_2 in memory on localhost:39845 (size: 16.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block rdd_32_0 stored as values in memory (estimated size 920.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added rdd_32_0 in memory on localhost:39845 (size: 920.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO Executor: Finished task 2.0 in stage 4.0 (TID 18). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO Executor: Finished task 1.0 in stage 4.0 (TID 17). 10453 bytes result sent to driver
19/11/14 01:19:29 INFO Executor: Finished task 0.0 in stage 4.0 (TID 16). 11065 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 2.0 in stage 4.0 (TID 18) in 63 ms on localhost (executor driver) (2/4)
19/11/14 01:19:29 INFO TaskSetManager: Finished task 1.0 in stage 4.0 (TID 17) in 65 ms on localhost (executor driver) (3/4)
19/11/14 01:19:29 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 16) in 65 ms on localhost (executor driver) (4/4)
19/11/14 01:19:29 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool 
19/11/14 01:19:29 INFO DAGScheduler: ResultStage 4 (collect at BoundedDataset.java:76) finished in 0.074 s
19/11/14 01:19:29 INFO DAGScheduler: Job 1 finished: collect at BoundedDataset.java:76, took 0.605815 s
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 980.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:39845 (size: 980.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 6 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 288.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 802.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:39845 (size: 802.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 7 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 288.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 802.0 B, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:39845 (size: 802.0 B, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 8 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:29 INFO SparkContext: Starting job: collect at BoundedDataset.java:76
19/11/14 01:19:29 INFO DAGScheduler: Got job 2 (collect at BoundedDataset.java:76) with 4 output partitions
19/11/14 01:19:29 INFO DAGScheduler: Final stage: ResultStage 5 (collect at BoundedDataset.java:76)
19/11/14 01:19:29 INFO DAGScheduler: Parents of final stage: List()
19/11/14 01:19:29 INFO DAGScheduler: Missing parents: List()
19/11/14 01:19:29 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75), which has no missing parents
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 30.9 KB, free 13.5 GB)
19/11/14 01:19:29 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/14 01:19:29 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:39845 (size: 12.4 KB, free: 13.5 GB)
19/11/14 01:19:29 INFO SparkContext: Created broadcast 9 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:29 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:29 INFO TaskSchedulerImpl: Adding task set 5.0 with 4 tasks
19/11/14 01:19:29 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 20, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 1.0 in stage 5.0 (TID 21, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 2.0 in stage 5.0 (TID 22, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:29 INFO TaskSetManager: Starting task 3.0 in stage 5.0 (TID 23, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/14 01:19:29 INFO Executor: Running task 1.0 in stage 5.0 (TID 21)
19/11/14 01:19:29 INFO Executor: Running task 3.0 in stage 5.0 (TID 23)
19/11/14 01:19:29 INFO Executor: Running task 0.0 in stage 5.0 (TID 20)
19/11/14 01:19:29 INFO Executor: Running task 2.0 in stage 5.0 (TID 22)
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_0 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_1 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_3 locally
19/11/14 01:19:29 INFO BlockManager: Found block rdd_19_2 locally
19/11/14 01:19:29 INFO Executor: Finished task 1.0 in stage 5.0 (TID 21). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 1.0 in stage 5.0 (TID 21) in 35 ms on localhost (executor driver) (1/4)
19/11/14 01:19:29 INFO Executor: Finished task 2.0 in stage 5.0 (TID 22). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 2.0 in stage 5.0 (TID 22) in 43 ms on localhost (executor driver) (2/4)
19/11/14 01:19:29 INFO Executor: Finished task 0.0 in stage 5.0 (TID 20). 10082 bytes result sent to driver
19/11/14 01:19:29 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 20) in 53 ms on localhost (executor driver) (3/4)
WARNING:root:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/11/14 01:19:40 INFO Executor: Finished task 3.0 in stage 5.0 (TID 23). 10125 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 3.0 in stage 5.0 (TID 23) in 11028 ms on localhost (executor driver) (4/4)
19/11/14 01:19:40 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
19/11/14 01:19:40 INFO DAGScheduler: ResultStage 5 (collect at BoundedDataset.java:76) finished in 11.035 s
19/11/14 01:19:40 INFO DAGScheduler: Job 2 finished: collect at BoundedDataset.java:76, took 11.040103 s
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 176.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 702.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:39845 (size: 702.0 B, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 10 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 980.0 B, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_11_piece0 in memory on localhost:39845 (size: 980.0 B, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 11 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/14 01:19:40 INFO SparkPipelineRunner: Job BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f: Pipeline translated successfully. Computing outputs
19/11/14 01:19:40 INFO SparkContext: Starting job: foreach at BoundedDataset.java:124
19/11/14 01:19:40 INFO DAGScheduler: Got job 3 (foreach at BoundedDataset.java:124) with 4 output partitions
19/11/14 01:19:40 INFO DAGScheduler: Final stage: ResultStage 6 (foreach at BoundedDataset.java:124)
19/11/14 01:19:40 INFO DAGScheduler: Parents of final stage: List()
19/11/14 01:19:40 INFO DAGScheduler: Missing parents: List()
19/11/14 01:19:40 INFO DAGScheduler: Submitting ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 32.8 KB, free 13.5 GB)
19/11/14 01:19:40 INFO MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 12.9 KB, free 13.5 GB)
19/11/14 01:19:40 INFO BlockManagerInfo: Added broadcast_12_piece0 in memory on localhost:39845 (size: 12.9 KB, free: 13.5 GB)
19/11/14 01:19:40 INFO SparkContext: Created broadcast 12 from broadcast at DAGScheduler.scala:1161
19/11/14 01:19:40 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/14 01:19:40 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks
19/11/14 01:19:40 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 24, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 25, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 26, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/14 01:19:40 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 27, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/14 01:19:40 INFO Executor: Running task 0.0 in stage 6.0 (TID 24)
19/11/14 01:19:40 INFO Executor: Running task 2.0 in stage 6.0 (TID 26)
19/11/14 01:19:40 INFO Executor: Running task 3.0 in stage 6.0 (TID 27)
19/11/14 01:19:40 INFO Executor: Running task 1.0 in stage 6.0 (TID 25)
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_2 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_0 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_3 locally
19/11/14 01:19:40 INFO BlockManager: Found block rdd_19_1 locally
19/11/14 01:19:40 INFO Executor: Finished task 0.0 in stage 6.0 (TID 24). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 0.0 in stage 6.0 (TID 24) in 39 ms on localhost (executor driver) (1/4)
19/11/14 01:19:40 INFO Executor: Finished task 1.0 in stage 6.0 (TID 25). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 1.0 in stage 6.0 (TID 25) in 53 ms on localhost (executor driver) (2/4)
19/11/14 01:19:40 INFO Executor: Finished task 2.0 in stage 6.0 (TID 26). 9341 bytes result sent to driver
19/11/14 01:19:40 INFO TaskSetManager: Finished task 2.0 in stage 6.0 (TID 26) in 58 ms on localhost (executor driver) (3/4)
INFO:root:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:root:Renamed 4 shards in 0.10 seconds.
19/11/14 01:19:51 INFO Executor: Finished task 3.0 in stage 6.0 (TID 27). 9384 bytes result sent to driver
19/11/14 01:19:51 INFO TaskSetManager: Finished task 3.0 in stage 6.0 (TID 27) in 11154 ms on localhost (executor driver) (4/4)
19/11/14 01:19:51 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool 
19/11/14 01:19:51 INFO DAGScheduler: ResultStage 6 (foreach at BoundedDataset.java:124) finished in 11.163 s
19/11/14 01:19:51 INFO DAGScheduler: Job 3 finished: foreach at BoundedDataset.java:124, took 11.166911 s
19/11/14 01:19:51 INFO SparkPipelineRunner: Job BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f finished.
19/11/14 01:19:51 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
19/11/14 01:19:51 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/14 01:19:51 INFO MemoryStore: MemoryStore cleared
19/11/14 01:19:51 INFO BlockManager: BlockManager stopped
19/11/14 01:19:51 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/14 01:19:51 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/14 01:19:51 INFO SparkContext: Successfully stopped SparkContext
19/11/14 01:19:51 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/14 01:19:51 INFO AbstractArtifactRetrievalService: Manifest at /tmp/beam-temp6bkwvgdy/artifacts68a0sk7c/job_e864a608-ff83-4b7a-96b0-53b39cd9be88/MANIFEST has 1 artifact locations
19/11/14 01:19:51 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-temp6bkwvgdy/artifacts68a0sk7c/job_e864a608-ff83-4b7a-96b0-53b39cd9be88/
INFO:root:Job state changed to DONE
19/11/14 01:19:51 INFO InMemoryJobService: Getting job metrics for BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f
19/11/14 01:19:51 INFO InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1114011924-c3f8f09e_c80d6b1b-60c0-46e9-8e73-527018a8319f
19/11/14 01:19:51 INFO ShutdownHookManager: Shutdown hook called
19/11/14 01:19:51 INFO ShutdownHookManager: Deleting directory /tmp/spark-b8df6111-ae9d-4fc3-b3c7-81e79e897a2d
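
The Spark log above is a Beam pipeline running on the portable Spark runner. A minimal sketch of pointing the Python SDK at such a job service; the endpoint address and environment type are assumptions:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',  # assumed job-service address
        '--environment_type=LOOPBACK',    # run the SDK harness in-process
    ])
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
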
Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 607, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004834822","description":"Error received from peer ipv4:127.0.0.1:40381","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 148, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004844614","description":"Error received from peer ipv4:127.0.0.1:46427","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:root:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004795874","description":"Error received from peer ipv4:127.0.0.1:41341","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573694392.004795874","description":"Error received from peer ipv4:127.0.0.1:41341","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
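
The _Rendezvous tracebacks above are background reader threads observing StatusCode.UNAVAILABLE ("Socket closed") when the job service tears down its gRPC channels at shutdown. A minimal sketch of tolerating that in a stream-reading thread; the response iterator and dispatch step are assumptions:

    import grpc

    def read_stream(responses):
        # 'responses' is a server-streaming RPC iterator; UNAVAILABLE at
        # shutdown just means the peer closed the socket.
        try:
            for response in responses:
                pass  # hypothetical: dispatch each response here
        except grpc.RpcError as e:
            if e.code() != grpc.StatusCode.UNAVAILABLE:
                raise
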


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 12s
82 actionable tasks: 62 executed, 20 from cache

Publishing build scan...
https://gradle.com/s/bgl22uhh5pvpm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Build failed in Jenkins: beam_PostCommit_Python36 #989

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/989/display/redirect?page=changes>

Changes:

[kirillkozlov] Created a MongoDbTable and a provider for it

[kirillkozlov] [SQL] Implemented write functionality for MongoDbTable, updated

[kirillkozlov] spotlesApply

[kirillkozlov] ToJson should support logical types

[kirillkozlov] Added RowJsonTest for logical types


------------------------------------------
[...truncated 559.40 KB...]
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573689094.758980017","description":"Error received from peer ipv4:127.0.0.1:45899","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 607, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573689094.758942413","description":"Error received from peer ipv4:127.0.0.1:39907","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
(unset)
[...'(unset)' repeated 56 more times...]
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_18-8487417414048868764?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_04_29-16431337957815962176?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_12_10-76926037268749475?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_19_45-15831289352896402195?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_27_18-805721596976429951?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_16-12503179732977812714?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_13_34-16502012432304503848?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_21_51-9970733606742144755?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_29_47-3205480505732144348?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_17-17487171326934125434?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_02_53-3097380415094331187?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_11_22-3607881339517771253?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-13818487076606140604?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_09_18-17309275289632105263?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_21-13709422992952475401?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_26_07-10239321693077747784?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-4544109910891428035?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_58_43-13785548934713492743?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_08_56-17041015389770448863?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_43-15785811389211781364?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_27_55-8814529213526684042?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_14-12349609192755690759?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_30-17760500035841430600?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_07_34-5175807771389470595?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_16_31-11445525652022241929?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_25_06-18217259726142285840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_34_16-6393408866674417789?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_16-3930703230832291457?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_34-14358804236747782894?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_08_10-14550330959125886823?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_16_27-982999766609357360?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_24_39-11983473404674629501?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
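The BeamDeprecationWarnings above flag the native BigQuerySink, and the message itself names the replacement, WriteToBigQuery. A minimal migration sketch, with the input rows, table spec, and schema assumed for illustration:

    import apache_beam as beam

    kms_key = None  # optionally a Cloud KMS key name, as in the deprecated sink

    with beam.Pipeline() as p:
        rows = p | beam.Create([{'word': 'beam', 'count': 1}])  # illustrative rows
        # Replaces beam.io.Write(beam.io.BigQuerySink(..., kms_key=kms_key)):
        _ = rows | beam.io.WriteToBigQuery(
            'my-project:my_dataset.my_table',  # hypothetical table spec
            schema='word:STRING,count:INTEGER',
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            kms_key=kms_key)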
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_50_15-8205326846623720335?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_15_59_51-9204801903114631675?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_09_29-13301100686002300133?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_18_46-4594808160034192914?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_16_28_20-8341259082287408629?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
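The recurring "options is deprecated since First stable release" warnings fire whenever code reaches back through <pipeline>.options, as the quoted lines do via p.options.view_as(...). The supported pattern is to build a PipelineOptions object up front and read settings from that object directly; a minimal sketch, with the flag value assumed:

    from apache_beam import Pipeline
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location', 'gs://example-bucket/tmp'])  # hypothetical flag
    temp_location = options.view_as(GoogleCloudOptions).temp_location  # read from options, not p.options
    with Pipeline(options=options) as p:
        pass  # construct the pipeline here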
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3151.066s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 55m 0s
81 actionable tasks: 75 executed, 6 from cache

Publishing build scan...
https://gradle.com/s/wuxp2vwavf5w4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #988

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/988/display/redirect?page=changes>

Changes:

[valentyn] Update container image tags used by Dataflow runner for Beam master

[lukasz.gajowy] Revert "Merge pull request #10072: [BEAM-8616] Make hadoop-client a


------------------------------------------
[...truncated 541.90 KB...]
19/11/13 23:01:59 INFO DAGScheduler: running: Set()
19/11/13 23:01:59 INFO DAGScheduler: waiting: Set(ResultStage 4)
19/11/13 23:01:59 INFO DAGScheduler: failed: Set()
19/11/13 23:01:59 INFO DAGScheduler: Submitting ResultStage 4 (MapPartitionsRDD[33] at map at BoundedDataset.java:75), which has no missing parents
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_6 stored as values in memory (estimated size 22.8 KB, free 13.5 GB)
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_6_piece0 stored as bytes in memory (estimated size 9.9 KB, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added broadcast_6_piece0 in memory on localhost:46531 (size: 9.9 KB, free: 13.5 GB)
19/11/13 23:01:59 INFO SparkContext: Created broadcast 6 from broadcast at DAGScheduler.scala:1161
19/11/13 23:01:59 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 4 (MapPartitionsRDD[33] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/13 23:01:59 INFO TaskSchedulerImpl: Adding task set 4.0 with 4 tasks
19/11/13 23:01:59 INFO TaskSetManager: Starting task 0.0 in stage 4.0 (TID 16, localhost, executor driver, partition 0, NODE_LOCAL, 7662 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 1.0 in stage 4.0 (TID 17, localhost, executor driver, partition 1, PROCESS_LOCAL, 7662 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 2.0 in stage 4.0 (TID 18, localhost, executor driver, partition 2, PROCESS_LOCAL, 7662 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 3.0 in stage 4.0 (TID 19, localhost, executor driver, partition 3, PROCESS_LOCAL, 7662 bytes)
19/11/13 23:01:59 INFO Executor: Running task 0.0 in stage 4.0 (TID 16)
19/11/13 23:01:59 INFO Executor: Running task 2.0 in stage 4.0 (TID 18)
19/11/13 23:01:59 INFO Executor: Running task 3.0 in stage 4.0 (TID 19)
19/11/13 23:01:59 INFO Executor: Running task 1.0 in stage 4.0 (TID 17)
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Getting 0 non-empty blocks including 0 local blocks and 0 remote blocks
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Getting 4 non-empty blocks including 4 local blocks and 0 remote blocks
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/13 23:01:59 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
19/11/13 23:01:59 INFO MemoryStore: Block rdd_32_2 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added rdd_32_2 in memory on localhost:46531 (size: 16.0 B, free: 13.5 GB)
19/11/13 23:01:59 INFO Executor: Finished task 2.0 in stage 4.0 (TID 18). 10453 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 2.0 in stage 4.0 (TID 18) in 40 ms on localhost (executor driver) (1/4)
19/11/13 23:01:59 INFO MemoryStore: Block rdd_32_3 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added rdd_32_3 in memory on localhost:46531 (size: 16.0 B, free: 13.5 GB)
19/11/13 23:01:59 INFO Executor: Finished task 3.0 in stage 4.0 (TID 19). 10453 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 3.0 in stage 4.0 (TID 19) in 50 ms on localhost (executor driver) (2/4)
19/11/13 23:01:59 INFO MemoryStore: Block rdd_32_1 stored as values in memory (estimated size 16.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added rdd_32_1 in memory on localhost:46531 (size: 16.0 B, free: 13.5 GB)
19/11/13 23:01:59 INFO Executor: Finished task 1.0 in stage 4.0 (TID 17). 10453 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 1.0 in stage 4.0 (TID 17) in 67 ms on localhost (executor driver) (3/4)
19/11/13 23:01:59 INFO MemoryStore: Block rdd_32_0 stored as values in memory (estimated size 920.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added rdd_32_0 in memory on localhost:46531 (size: 920.0 B, free: 13.5 GB)
19/11/13 23:01:59 INFO Executor: Finished task 0.0 in stage 4.0 (TID 16). 11065 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 0.0 in stage 4.0 (TID 16) in 78 ms on localhost (executor driver) (4/4)
19/11/13 23:01:59 INFO TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool 
19/11/13 23:01:59 INFO DAGScheduler: ResultStage 4 (collect at BoundedDataset.java:76) finished in 0.088 s
19/11/13 23:01:59 INFO DAGScheduler: Job 1 finished: collect at BoundedDataset.java:76, took 0.732977 s
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_7 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_7_piece0 stored as bytes in memory (estimated size 981.0 B, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added broadcast_7_piece0 in memory on localhost:46531 (size: 981.0 B, free: 13.5 GB)
19/11/13 23:01:59 INFO SparkContext: Created broadcast 7 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/13 23:01:59 INFO SparkContext: Starting job: collect at BoundedDataset.java:76
19/11/13 23:01:59 INFO DAGScheduler: Got job 2 (collect at BoundedDataset.java:76) with 4 output partitions
19/11/13 23:01:59 INFO DAGScheduler: Final stage: ResultStage 5 (collect at BoundedDataset.java:76)
19/11/13 23:01:59 INFO DAGScheduler: Parents of final stage: List()
19/11/13 23:01:59 INFO DAGScheduler: Missing parents: List()
19/11/13 23:01:59 INFO DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75), which has no missing parents
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_8 stored as values in memory (estimated size 30.9 KB, free 13.5 GB)
19/11/13 23:01:59 INFO MemoryStore: Block broadcast_8_piece0 stored as bytes in memory (estimated size 12.4 KB, free 13.5 GB)
19/11/13 23:01:59 INFO BlockManagerInfo: Added broadcast_8_piece0 in memory on localhost:46531 (size: 12.4 KB, free: 13.5 GB)
19/11/13 23:01:59 INFO SparkContext: Created broadcast 8 from broadcast at DAGScheduler.scala:1161
19/11/13 23:01:59 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 5 (MapPartitionsRDD[36] at map at BoundedDataset.java:75) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/13 23:01:59 INFO TaskSchedulerImpl: Adding task set 5.0 with 4 tasks
19/11/13 23:01:59 INFO TaskSetManager: Starting task 0.0 in stage 5.0 (TID 20, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 1.0 in stage 5.0 (TID 21, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 2.0 in stage 5.0 (TID 22, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:01:59 INFO TaskSetManager: Starting task 3.0 in stage 5.0 (TID 23, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/13 23:01:59 INFO Executor: Running task 2.0 in stage 5.0 (TID 22)
19/11/13 23:01:59 INFO Executor: Running task 3.0 in stage 5.0 (TID 23)
19/11/13 23:01:59 INFO Executor: Running task 1.0 in stage 5.0 (TID 21)
19/11/13 23:01:59 INFO Executor: Running task 0.0 in stage 5.0 (TID 20)
19/11/13 23:01:59 INFO BlockManager: Found block rdd_19_3 locally
19/11/13 23:01:59 INFO BlockManager: Found block rdd_19_0 locally
19/11/13 23:01:59 INFO BlockManager: Found block rdd_19_1 locally
19/11/13 23:01:59 INFO BlockManager: Found block rdd_19_2 locally
19/11/13 23:01:59 INFO Executor: Finished task 2.0 in stage 5.0 (TID 22). 10082 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 2.0 in stage 5.0 (TID 22) in 52 ms on localhost (executor driver) (1/4)
19/11/13 23:01:59 INFO Executor: Finished task 1.0 in stage 5.0 (TID 21). 10082 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 1.0 in stage 5.0 (TID 21) in 66 ms on localhost (executor driver) (2/4)
19/11/13 23:01:59 INFO Executor: Finished task 0.0 in stage 5.0 (TID 20). 10082 bytes result sent to driver
19/11/13 23:01:59 INFO TaskSetManager: Finished task 0.0 in stage 5.0 (TID 20) in 75 ms on localhost (executor driver) (3/4)
WARNING:root:Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d
19/11/13 23:02:16 INFO Executor: Finished task 3.0 in stage 5.0 (TID 23). 10125 bytes result sent to driver
19/11/13 23:02:16 INFO TaskSetManager: Finished task 3.0 in stage 5.0 (TID 23) in 17163 ms on localhost (executor driver) (4/4)
19/11/13 23:02:16 INFO TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool 
19/11/13 23:02:16 INFO DAGScheduler: ResultStage 5 (collect at BoundedDataset.java:76) finished in 17.176 s
19/11/13 23:02:16 INFO DAGScheduler: Job 2 finished: collect at BoundedDataset.java:76, took 17.180078 s
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_9 stored as values in memory (estimated size 176.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_9_piece0 stored as bytes in memory (estimated size 702.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO BlockManagerInfo: Added broadcast_9_piece0 in memory on localhost:46531 (size: 702.0 B, free: 13.5 GB)
19/11/13 23:02:16 INFO SparkContext: Created broadcast 9 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_10 stored as values in memory (estimated size 832.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_10_piece0 stored as bytes in memory (estimated size 981.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO BlockManagerInfo: Added broadcast_10_piece0 in memory on localhost:46531 (size: 981.0 B, free: 13.5 GB)
19/11/13 23:02:16 INFO SparkContext: Created broadcast 10 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_11 stored as values in memory (estimated size 288.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_11_piece0 stored as bytes in memory (estimated size 802.0 B, free 13.5 GB)
19/11/13 23:02:16 INFO BlockManagerInfo: Added broadcast_11_piece0 in memory on localhost:46531 (size: 802.0 B, free: 13.5 GB)
19/11/13 23:02:16 INFO SparkContext: Created broadcast 11 from broadcast at SparkBatchPortablePipelineTranslator.java:336
19/11/13 23:02:16 INFO SparkPipelineRunner: Job BeamApp-jenkins-1113230154-f528f1c1_c4a33b52-e3a0-4592-9734-2b19bc6c485a: Pipeline translated successfully. Computing outputs
19/11/13 23:02:16 INFO SparkContext: Starting job: foreach at BoundedDataset.java:124
19/11/13 23:02:16 INFO DAGScheduler: Got job 3 (foreach at BoundedDataset.java:124) with 4 output partitions
19/11/13 23:02:16 INFO DAGScheduler: Final stage: ResultStage 6 (foreach at BoundedDataset.java:124)
19/11/13 23:02:16 INFO DAGScheduler: Parents of final stage: List()
19/11/13 23:02:16 INFO DAGScheduler: Missing parents: List()
19/11/13 23:02:16 INFO DAGScheduler: Submitting ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311), which has no missing parents
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_12 stored as values in memory (estimated size 32.8 KB, free 13.5 GB)
19/11/13 23:02:16 INFO MemoryStore: Block broadcast_12_piece0 stored as bytes in memory (estimated size 12.9 KB, free 13.5 GB)
19/11/13 23:02:16 INFO BlockManagerInfo: Added broadcast_12_piece0 in memory on localhost:46531 (size: 12.9 KB, free: 13.5 GB)
19/11/13 23:02:16 INFO SparkContext: Created broadcast 12 from broadcast at DAGScheduler.scala:1161
19/11/13 23:02:16 INFO DAGScheduler: Submitting 4 missing tasks from ResultStage 6 (EmptyOutputSink_0 MapPartitionsRDD[38] at flatMap at SparkBatchPortablePipelineTranslator.java:311) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
19/11/13 23:02:16 INFO TaskSchedulerImpl: Adding task set 6.0 with 4 tasks
19/11/13 23:02:16 INFO TaskSetManager: Starting task 0.0 in stage 6.0 (TID 24, localhost, executor driver, partition 0, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:02:16 INFO TaskSetManager: Starting task 1.0 in stage 6.0 (TID 25, localhost, executor driver, partition 1, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:02:16 INFO TaskSetManager: Starting task 2.0 in stage 6.0 (TID 26, localhost, executor driver, partition 2, PROCESS_LOCAL, 7868 bytes)
19/11/13 23:02:16 INFO TaskSetManager: Starting task 3.0 in stage 6.0 (TID 27, localhost, executor driver, partition 3, PROCESS_LOCAL, 7879 bytes)
19/11/13 23:02:16 INFO Executor: Running task 0.0 in stage 6.0 (TID 24)
19/11/13 23:02:16 INFO Executor: Running task 3.0 in stage 6.0 (TID 27)
19/11/13 23:02:16 INFO Executor: Running task 1.0 in stage 6.0 (TID 25)
19/11/13 23:02:16 INFO Executor: Running task 2.0 in stage 6.0 (TID 26)
19/11/13 23:02:16 INFO BlockManager: Found block rdd_19_0 locally
19/11/13 23:02:16 INFO BlockManager: Found block rdd_19_3 locally
19/11/13 23:02:16 INFO BlockManager: Found block rdd_19_1 locally
19/11/13 23:02:16 INFO BlockManager: Found block rdd_19_2 locally
19/11/13 23:02:16 INFO Executor: Finished task 0.0 in stage 6.0 (TID 24). 9341 bytes result sent to driver
19/11/13 23:02:16 INFO TaskSetManager: Finished task 0.0 in stage 6.0 (TID 24) in 58 ms on localhost (executor driver) (1/4)
19/11/13 23:02:16 INFO Executor: Finished task 1.0 in stage 6.0 (TID 25). 9341 bytes result sent to driver
19/11/13 23:02:16 INFO TaskSetManager: Finished task 1.0 in stage 6.0 (TID 25) in 63 ms on localhost (executor driver) (2/4)
19/11/13 23:02:16 INFO Executor: Finished task 2.0 in stage 6.0 (TID 26). 9341 bytes result sent to driver
19/11/13 23:02:16 INFO TaskSetManager: Finished task 2.0 in stage 6.0 (TID 26) in 73 ms on localhost (executor driver) (3/4)
INFO:root:Starting finalize_write threads with num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:root:Renamed 4 shards in 0.10 seconds.
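The earlier cleanup warning ("Deleting 4 existing files in target path matching: -*-of-%(num_shards)05d") and the finalize_write/rename lines above are the file sink's default sharding at work: each output file is named <prefix>-SSSSS-of-NNNNN. A sketch of a write that produces exactly these four shards (the path is hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | beam.Create(['a', 'b', 'c', 'd'])
             | beam.io.WriteToText('/tmp/out/prefix', num_shards=4))
        # Default naming yields prefix-00000-of-00004 ... prefix-00003-of-00004.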
19/11/13 23:02:33 INFO Executor: Finished task 3.0 in stage 6.0 (TID 27). 9427 bytes result sent to driver
19/11/13 23:02:33 INFO TaskSetManager: Finished task 3.0 in stage 6.0 (TID 27) in 16616 ms on localhost (executor driver) (4/4)
19/11/13 23:02:33 INFO TaskSchedulerImpl: Removed TaskSet 6.0, whose tasks have all completed, from pool 
19/11/13 23:02:33 INFO DAGScheduler: ResultStage 6 (foreach at BoundedDataset.java:124) finished in 16.625 s
19/11/13 23:02:33 INFO DAGScheduler: Job 3 finished: foreach at BoundedDataset.java:124, took 16.629480 s
19/11/13 23:02:33 INFO SparkPipelineRunner: Job BeamApp-jenkins-1113230154-f528f1c1_c4a33b52-e3a0-4592-9734-2b19bc6c485a finished.
19/11/13 23:02:33 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
19/11/13 23:02:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/13 23:02:33 INFO MemoryStore: MemoryStore cleared
19/11/13 23:02:33 INFO BlockManager: BlockManager stopped
19/11/13 23:02:33 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/13 23:02:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/13 23:02:33 INFO SparkContext: Successfully stopped SparkContext
19/11/13 23:02:33 WARN SparkPipelineResult$BatchMode: Collecting monitoring infos is not implemented yet in Spark portable runner.
19/11/13 23:02:33 INFO AbstractArtifactRetrievalService: Manifest at /tmp/beam-tempyqv9id5o/artifactsls2l5z6q/job_d2fb8f41-3342-4fca-aea0-abd80ab2acde/MANIFEST has 1 artifact locations
19/11/13 23:02:33 INFO BeamFileSystemArtifactStagingService: Removed dir /tmp/beam-tempyqv9id5o/artifactsls2l5z6q/job_d2fb8f41-3342-4fca-aea0-abd80ab2acde/
INFO:root:Job state changed to DONE
19/11/13 23:02:33 INFO InMemoryJobService: Getting job metrics for BeamApp-jenkins-1113230154-f528f1c1_c4a33b52-e3a0-4592-9734-2b19bc6c485a
19/11/13 23:02:33 INFO InMemoryJobService: Finished getting job metrics for BeamApp-jenkins-1113230154-f528f1c1_c4a33b52-e3a0-4592-9734-2b19bc6c485a
19/11/13 23:02:33 INFO ShutdownHookManager: Shutdown hook called
19/11/13 23:02:33 INFO ShutdownHookManager: Deleting directory /tmp/spark-fe19309d-c457-40fc-937c-6c779f9f302f
Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 148, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573686153.919346429","description":"Error received from peer ipv4:127.0.0.1:35647","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py",> line 607, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573686153.919295140","description":"Error received from peer ipv4:127.0.0.1:46041","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

ERROR:root:Failed to read inputs in the data plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573686153.919239774","description":"Error received from peer ipv4:127.0.0.1:44491","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573686153.919239774","description":"Error received from peer ipv4:127.0.0.1:44491","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
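The three near-identical _Rendezvous tracebacks above are the SDK worker's control, state, and data-plane readers all observing the job server close its sockets after the job reached DONE, so the UNAVAILABLE / "Socket closed" status here looks like teardown noise rather than a pipeline failure. A minimal sketch of tolerating that case when draining a gRPC stream (the helper is an illustration, not the worker's actual code):

    import grpc

    def drain(responses):
        """Yield items from a gRPC response stream, ignoring UNAVAILABLE at shutdown."""
        try:
            for response in responses:
                yield response
        except grpc.RpcError as err:
            # "Socket closed" during teardown surfaces as StatusCode.UNAVAILABLE;
            # anything else is a genuine error and is re-raised.
            if err.code() != grpc.StatusCode.UNAVAILABLE:
                raise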


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/build.gradle>' line: 46

* What went wrong:
Execution failed for task ':sdks:python:sdist'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 3m 48s
81 actionable tasks: 60 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/pgvzvfbgywc4y

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #987

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/987/display/redirect?page=changes>

Changes:

[lukecwik] [BEAM-8575] Windows idempotency: Applying the same window fn (or wind…


------------------------------------------
[...truncated 116.68 KB...]
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 24.194s

OK (SKIP=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
(unset)
[...56 more identical '(unset)' lines truncated...]
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_12-16471722422024578268?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_14_27-12058495852534473971?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_22_21-12968650120487758479?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_30_56-16364955325248859231?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_38_34-15491482146434202197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_09-15473273328697066784?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_22_39-6742171972952643985?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_31_04-5103476388344138875?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_39_55-800620294863721072?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_12-4622777261845108943?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_12_07-8007245831193731559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_20_35-9077723894879792195?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_38_42-6186870328413760997?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_09-6170946332257084467?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_19_45-6391667217800690242?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_28_23-6256870999071000091?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_37_16-7477614685363173964?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_09-17007177046061079391?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_08_17-14240006904775958947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_16_11-15594291452590956774?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_24_34-18269006757302020808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_32_42-5249428852955728560?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_40_15-15991443867090672290?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_08-12127582196587183183?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_07_44-2165586584063161699?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_16_57-4103724518418699286?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_25_59-564763904787924522?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_33_36-9421924301988033933?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_10-7928392958846298887?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_08_44-2576694781072756576?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_20_10-3164908055946548964?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_29_17-16903014185549126917?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_59_11-10779218812482642289?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_08_21-11069752918667746363?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_16_12-5908167975999882795?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_23_47-13463741412467603881?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_14_32_03-7513213149799164304?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3036.917s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py36:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 35s
79 actionable tasks: 58 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/jsv3dfot5u2pk

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python36 #986

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python36/986/display/redirect?page=changes>

Changes:

[github] Beam changes to bypass recording timing for first N times seeing a new


------------------------------------------
[...truncated 613.53 KB...]
	details = "Socket closed"
	debug_error_string = "{"created":"@1573677862.282038452","description":"Error received from peer ipv4:127.0.0.1:40083","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.6/threading.py", line 864, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py",> line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 392, in __next__
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/build/gradleenv/2022703440/lib/python3.6/site-packages/grpc/_channel.py",> line 561, in _next
    raise self
grpc._channel._Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1573677862.281858250","description":"Error received from peer ipv4:127.0.0.1:38227","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
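
The _Rendezvous errors above are raised by the gRPC Python client when the data channel is closed mid-read; StatusCode.UNAVAILABLE marks a transient transport failure. A minimal sketch of consuming such a stream defensively — assuming a make_stream() factory (hypothetical) that returns a fresh response iterator — might look like:

    import grpc

    def read_stream(make_stream, max_attempts=3):
        """Consume a gRPC streaming response, retrying transient failures."""
        for attempt in range(max_attempts):
            try:
                for element in make_stream():  # make_stream() is hypothetical
                    yield element
                return
            except grpc.RpcError as err:
                # Retry only UNAVAILABLE ("Socket closed"), as seen above;
                # re-raise anything else, or give up on the last attempt.
                if (err.code() != grpc.StatusCode.UNAVAILABLE
                        or attempt == max_attempts - 1):
                    raise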


> Task :sdks:python:test-suites:portable:py36:postCommitPy36

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
(unset)
[... "(unset)" repeated 57 times ...]
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_27-15950430297505925219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_56_46-13577691611751284272?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_05_10-16183411115295606529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_14_36-15186091317732571428?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_22_34-16992407908525115684?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_24-11216128625767995368?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_02_11-13482002494790297120?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_11_05-6751084741069402416?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_19_44-13129887367533148826?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_26-13476611919365447654?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_55_05-18383928034019907616?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_03_36-7107058280537644189?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_23-2468909454401726298?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_51_46-10245439968014071256?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_02_16-1967939023002148889?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_12_15-2156209574767704181?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_21_05-17127088798661974589?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_24-13278623593721521132?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_03_46-1571497898797294918?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_11_52-5798079673377009060?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_20_44-4513687464155021995?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_23-3092438930474130633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_50_04-6586089988255368072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_58_26-8237140079553718398?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_07_38-8609734381154309114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_14_54-8504654764952375954?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
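
The BeamDeprecationWarning about <pipeline>.options, repeated throughout this run, asks callers to keep their own PipelineOptions reference instead of reading it back off the pipeline. A sketch of the suggested pattern (the experiment flag is a placeholder):

    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    # Construct options once and read views from this object, rather than
    # via <pipeline>.options, which the warning flags as deprecated.
    options = PipelineOptions(['--experiments=some_experiment'])  # placeholder flag
    experiments = options.view_as(DebugOptions).experiments or []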
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_25-4597573967095506696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_51_04-17563346098056610963?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_59_08-4923161665633035686?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_07_40-9482950222423811880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_16_52-16532142093112161003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_25_00-12887891035314052022?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_42_24-2641188339585383266?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_51_22-4185572946794272909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_12_59_47-8538757455936912653?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_07_46-4564489576674392141?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-13_13_16_08-9013601737545834835?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:723: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
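
The BigQuerySink deprecation warnings above name the replacement directly: WriteToBigQuery. A minimal sketch of the suggested transform (project, dataset, table, and schema are hypothetical):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | 'Create' >> beam.Create([{'word': 'beam', 'count': 1}])
         | 'Write' >> beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',  # hypothetical table spec
             schema='word:STRING,count:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))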
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3049.894s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 51m 53s
82 actionable tasks: 61 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/5qo3gdofnjx7e

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org