Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/11/28 20:48:59 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #6685

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6685/display/redirect?page=changes>

Changes:

[millsd] Add an option to create Dataflow pipelines from a snapshot

------------------------------------------
[...truncated 411.01 KB...]
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20181106"
      }
    ]
  }, 
  "name": "beamapp-jenkins-1128202605-346195", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)", 
        "bigquery_use_legacy_sql": false, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15434367645", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLarkCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwCu1BVY", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
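A note on the job graph above: the "schema" property of the "s2" ParallelWrite step is a JSON-encoded string, not a nested object. Decoding it with Python's standard library (a minimal illustration using the exact string from the job graph, not part of the test run itself) recovers the single nullable STRING field the query-to-table test writes:

```python
import json

# Exact value of the "schema" property from the "s2" ParallelWrite step.
schema_str = '{"fields": [{"type": "STRING", "name": "fruit", "mode": "NULLABLE"}]}'

schema = json.loads(schema_str)
field = schema["fields"][0]
print(field["name"], field["type"], field["mode"])  # fruit STRING NULLABLE
```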
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Create job: <Job
 createTime: u'2018-11-28T20:26:19.099124Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-11-28_12_26_18-9096341250788214203'
 location: u'us-central1'
 name: u'beamapp-jenkins-1128202605-346195'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2018-11-28T20:26:19.099124Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-11-28_12_26_18-9096341250788214203]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_26_18-9096341250788214203?project=apache-beam-testing
root: INFO: Job 2018-11-28_12_26_18-9096341250788214203 is in state JOB_STATE_RUNNING
root: INFO: 2018-11-28T20:26:18.175Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-11-28_12_26_18-9096341250788214203. The number of workers will be between 1 and 1000.
root: INFO: 2018-11-28T20:26:18.253Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-11-28_12_26_18-9096341250788214203.
root: INFO: 2018-11-28T20:26:21.151Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-11-28T20:26:22.243Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2018-11-28T20:26:22.916Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-11-28T20:26:23.013Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-11-28T20:26:23.042Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-11-28T20:26:23.071Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-11-28T20:26:23.248Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-11-28T20:26:23.284Z: JOB_MESSAGE_DETAILED: Fusing consumer write/NativeWrite into read
root: INFO: 2018-11-28T20:26:23.325Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-11-28T20:26:23.366Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-11-28T20:26:23.412Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-11-28T20:26:23.443Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-11-28T20:26:23.594Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2018-11-28T20:26:23.670Z: JOB_MESSAGE_BASIC: Executing operation read+write/NativeWrite
root: INFO: 2018-11-28T20:26:23.724Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-11-28T20:26:23.764Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2018-11-28T20:26:27.387Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_2432959462828815615". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2432959462828815615".
root: INFO: 2018-11-28T20:26:34.051Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-11-28T20:27:02.092Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-11-28T20:28:08.025Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_2432959462828815615"
root: INFO: 2018-11-28T20:28:08.399Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_18345827657290598385" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_18345827657290598385".
root: INFO: 2018-11-28T20:28:38.715Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_18345827657290598385" observed total of 1 exported files thus far.
root: INFO: 2018-11-28T20:28:38.754Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_18345827657290598385"
root: INFO: 2018-11-28T20:28:53.894Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-11-28T20:28:53.924Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-11-28T20:28:55.009Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112820260-11281226-8wec-harness-vqdc. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T20:28:58.661Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112820260-11281226-8wec-harness-vqdc. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T20:29:15.926Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112820260-11281226-8wec-harness-vqdc. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T20:29:45.925Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112820260-11281226-8wec-harness-vqdc. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T20:29:45.950Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_2432959462828816745". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_2432959462828816745".
root: INFO: 2018-11-28T20:29:47.355Z: JOB_MESSAGE_DEBUG: Executing failure step failure2
root: INFO: 2018-11-28T20:29:47.393Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:read+write/NativeWrite failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-112820260-11281226-8wec-harness-vqdc,
  beamapp-jenkins-112820260-11281226-8wec-harness-vqdc,
  beamapp-jenkins-112820260-11281226-8wec-harness-vqdc,
  beamapp-jenkins-112820260-11281226-8wec-harness-vqdc
root: INFO: 2018-11-28T20:29:47.500Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-11-28T20:29:47.580Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-11-28T20:29:47.623Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-11-28T20:32:14.543Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
root: INFO: 2018-11-28T20:32:14.593Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2018-11-28T20:32:14.641Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-11-28T20:32:14.690Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-11-28_12_26_18-9096341250788214203 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
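The "attempted 4 times without success" failure in the log above reflects Dataflow's give-up policy for a work item: each attempt hit the same setup error on the same worker harness, so after four tries the whole step fails. A rough, stdlib-only sketch of that semantics (the `run_work_item` helper, the `always_fails` callable, and the worker name are illustrative assumptions, not Dataflow code):

```python
def run_work_item(attempt, worker_name, max_attempts=4):
    """Retry a work-item callable up to max_attempts, recording each worker used."""
    attempted_on = []
    for _ in range(max_attempts):
        attempted_on.append(worker_name)  # in the log, every retry hit the same harness
        try:
            return attempt()
        except Exception:
            continue  # e.g. a setup error, or the worker lost contact with the service
    raise RuntimeError(
        "A work item was attempted %d times without success. "
        "The work item was attempted on:\n  %s"
        % (max_attempts, ",\n  ".join(attempted_on))
    )

def always_fails():
    raise IOError("setup error detected in worker harness")

try:
    run_work_item(always_fails, "harness-vqdc")
except RuntimeError as err:
    print(err)
```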

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2964.297s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_49-9532403667341579846?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_07_06-17528815923856674280?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_15_10-6246047193791590043?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_21_25-5322538106275128039?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_28_05-15191525909008017646?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_35_10-12756323091361154854?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_41_51-15027088113137798851?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_51-7193432073112779844?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_15_00-13241147710981575181?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_50-14248252158617924255?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_50-9749204278689066815?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_13_14-789501114324695635?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_19_42-1646671265726935956?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_26_18-9096341250788214203?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_49-15402794232521469618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_49-477692027947995763?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_07_36-6159384365199494952?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_49-12668301757394372524?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_07_42-3531504371484442780?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_15_17-13427280356923116033?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_22_39-12080238044508835673?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_11_59_49-14834077439228847792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_12_08_55-12192615889639445411?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 49 mins 25.089 secs.

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 213

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 319

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 52m 58s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/qcb67257j5lzy

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #6687

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6687/display/redirect?page=changes>




Build failed in Jenkins: beam_PostCommit_Python_Verify #6686

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6686/display/redirect?page=changes>

Changes:

[github] Update Java Container beam-master-20181128

------------------------------------------
[...truncated 408.24 KB...]
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1128234013-045152.1543448413.045286/setuptools-40.4.3.zip", 
            "name": "setuptools-40.4.3.zip"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1128234013-045152.1543448413.045286/funcsigs-1.0.2.tar.gz", 
            "name": "funcsigs-1.0.2.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1128234013-045152.1543448413.045286/pbr-4.1.0.tar.gz", 
            "name": "pbr-4.1.0.tar.gz"
          }, 
          {
            "location": "storage.googleapis.com/temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-1128234013-045152.1543448413.045286/dataflow_python_sdk.tar", 
            "name": "dataflow_python_sdk.tar"
          }
        ], 
        "taskrunnerSettings": {
          "parallelWorkerSettings": {
            "baseUrl": "https://dataflow.googleapis.com", 
            "servicePath": "https://dataflow.googleapis.com"
          }
        }, 
        "workerHarnessContainerImage": "gcr.io/cloud-dataflow/v1beta3/python:beam-master-20181106"
      }
    ]
  }, 
  "name": "beamapp-jenkins-1128234013-045152", 
  "steps": [
    {
      "kind": "ParallelRead", 
      "name": "s1", 
      "properties": {
        "bigquery_export_format": "FORMAT_AVRO", 
        "bigquery_flatten_results": true, 
        "bigquery_query": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)", 
        "bigquery_use_legacy_sql": false, 
        "display_data": [
          {
            "key": "source", 
            "label": "Read Source", 
            "namespace": "apache_beam.io.iobase.Read", 
            "shortValue": "BigQuerySource", 
            "type": "STRING", 
            "value": "apache_beam.io.gcp.bigquery.BigQuerySource"
          }, 
          {
            "key": "query", 
            "label": "Query", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "STRING", 
            "value": "SELECT * FROM (SELECT \"apple\" as fruit) UNION ALL (SELECT \"orange\" as fruit)"
          }, 
          {
            "key": "validation", 
            "label": "Validation Enabled", 
            "namespace": "apache_beam.io.gcp.bigquery.BigQuerySource", 
            "type": "BOOLEAN", 
            "value": false
          }
        ], 
        "format": "bigquery", 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "read.out"
          }
        ], 
        "user_name": "read"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s2", 
      "properties": {
        "create_disposition": "CREATE_IF_NEEDED", 
        "dataset": "python_query_to_table_15434484125317", 
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLarkCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwCu1BVY", 
              "component_encodings": []
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "bigquery", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "schema": "{\"fields\": [{\"type\": \"STRING\", \"name\": \"fruit\", \"mode\": \"NULLABLE\"}]}", 
        "table": "output_table", 
        "user_name": "write/NativeWrite", 
        "write_disposition": "WRITE_EMPTY"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Create job: <Job
 createTime: u'2018-11-28T23:40:26.693544Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-11-28_15_40_25-4477463378756086364'
 location: u'us-central1'
 name: u'beamapp-jenkins-1128234013-045152'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2018-11-28T23:40:26.693544Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-11-28_15_40_25-4477463378756086364]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_40_25-4477463378756086364?project=apache-beam-testing
root: INFO: Job 2018-11-28_15_40_25-4477463378756086364 is in state JOB_STATE_RUNNING
root: INFO: 2018-11-28T23:40:25.728Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-11-28_15_40_25-4477463378756086364. The number of workers will be between 1 and 1000.
root: INFO: 2018-11-28T23:40:25.879Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-11-28_15_40_25-4477463378756086364.
root: INFO: 2018-11-28T23:40:28.685Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-11-28T23:40:30.177Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2018-11-28T23:40:30.761Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-11-28T23:40:30.813Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-11-28T23:40:30.863Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-11-28T23:40:30.904Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-11-28T23:40:31.095Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-11-28T23:40:31.139Z: JOB_MESSAGE_DETAILED: Fusing consumer write/NativeWrite into read
root: INFO: 2018-11-28T23:40:31.190Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-11-28T23:40:31.241Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-11-28T23:40:31.288Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-11-28T23:40:31.326Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-11-28T23:40:31.536Z: JOB_MESSAGE_DEBUG: Executing wait step start3
root: INFO: 2018-11-28T23:40:31.642Z: JOB_MESSAGE_BASIC: Executing operation read+write/NativeWrite
root: INFO: 2018-11-28T23:40:31.699Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-11-28T23:40:31.742Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2018-11-28T23:40:35.378Z: JOB_MESSAGE_BASIC: BigQuery query issued as job: "dataflow_job_3922155939095548356". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_3922155939095548356".
root: INFO: 2018-11-28T23:40:53.261Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-11-28T23:41:34.128Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-11-28T23:42:17.506Z: JOB_MESSAGE_BASIC: BigQuery query completed, job : "dataflow_job_3922155939095548356"
root: INFO: 2018-11-28T23:42:17.946Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_7667303689285975083" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_7667303689285975083".
root: INFO: 2018-11-28T23:42:48.194Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_7667303689285975083" observed total of 1 exported files thus far.
root: INFO: 2018-11-28T23:42:48.232Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_7667303689285975083"
root: INFO: 2018-11-28T23:43:13.497Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-11-28T23:43:13.543Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-11-28T23:43:14.527Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112823401-11281540-32vw-harness-g016. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T23:43:17.223Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112823401-11281540-32vw-harness-g016. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T23:43:34.955Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112823401-11281540-32vw-harness-g016. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T23:44:06.942Z: JOB_MESSAGE_ERROR: A setup error was detected in beamapp-jenkins-112823401-11281540-32vw-harness-g016. Please refer to the worker-startup log for detailed information.
root: INFO: 2018-11-28T23:44:06.948Z: JOB_MESSAGE_BASIC: Executing BigQuery import job "dataflow_job_3922155939095549834". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_3922155939095549834".
root: INFO: 2018-11-28T23:44:08.427Z: JOB_MESSAGE_DEBUG: Executing failure step failure2
root: INFO: 2018-11-28T23:44:08.472Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S01:read+write/NativeWrite failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-112823401-11281540-32vw-harness-g016,
  beamapp-jenkins-112823401-11281540-32vw-harness-g016,
  beamapp-jenkins-112823401-11281540-32vw-harness-g016,
  beamapp-jenkins-112823401-11281540-32vw-harness-g016
root: INFO: 2018-11-28T23:44:08.600Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-11-28T23:44:08.775Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-11-28T23:44:08.813Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-11-28T23:46:01.614Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
root: INFO: 2018-11-28T23:46:01.649Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2018-11-28T23:46:01.704Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-11-28T23:46:01.740Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-11-28_15_40_25-4477463378756086364 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 18 tests in 2920.289s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_54-13615104888960877507?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_52-9400679678638519468?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_53-6075801465497724306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_27_27-14319807732445100338?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_33_39-10331277531000647912?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_40_25-4477463378756086364?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_53-1645609252412955697?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_52-8509492374639455184?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_22_22-12276281355777158618?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_29_27-15654332561877465757?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_35_31-5642543654261662647?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_42_36-7024208794640864508?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_49_56-13079128125814320692?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_56_46-18005924194867197100?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_51-3935835598119192817?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_21_45-9023454064259273224?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_29_16-10702784156677370842?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_52-2268146554431745424?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_22_47-15536715932658200225?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_29_59-10522734332246375463?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_36_51-7197678719886572878?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_14_52-1040098932496000975?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-11-28_15_23_31-789894470961171233?project=apache-beam-testing.

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 48 mins 41.374 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 319

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 53m 11s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/7jglysn2bfuw6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
