Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/06/07 12:36:46 UTC

Build failed in Jenkins: beam_PostCommit_Python3_Verify #1069

See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1069/display/redirect?page=changes>

Changes:

[katarzyna.kucharczyk] [BEAM-5995] Fix: cron job naming

------------------------------------------
[...truncated 654.20 KB...]
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s24"
        },
        "serialized_fn": "eNq9VG1z20QQPtlO2qoxJQk0KS3gFgoOUAsor6UJNE7SJqZuUEN8BYw4SWdLjd72dGriIZ7hZZTJ7+Cv8IEv/ChWZ6chQPqR0eik3X32ub3dvf2pXHdYwhyPWzZnYUMKFqW9WIRpw4kF15ssCJgd8I5gScLFSrwW6UAWfgZtCKU6PUMISUTs8DSFsuP6QdCwilW3HMGZ5FYvixzpx+hUqZ+wBzFzLTlIuA4T9CzSNGOXb6EMkzmcMeFsvaW1CL6l1myzekjIPiG/aqSvkYdwrp2DvkA19NqD8zlM0RR/DS8OufGYRzt+lB59b6QBe8KN3VjspHhMbhSntDbjVDbjMPSltTmQXhzdtLa58HsDIxWOkbo7qZEovfG33BjHuTGK3DSSAVRV6LcDFtouW4Ln7lcnmgQu0BJqexE8n8P0goQZE2ZPHL7PpcWkFDq8oAjszA8kRgsvqoyiubDCxQOYM2H+hKsfJrGQVhi7WYC5u0Qvo8MzKggv5XDZhCtqHwtJHGlZ8PIBvGLCq3SyUHLIWAC19n/Vz+EowFWvUvfGFZlozWBF/pSEHKqKDDUyWCJSOxJLxf+oWMMy2S+R/TLZKROxTmSJuNpY0yuRi4j4RSOd6EdSkYjRifiDaJq2t1G4r3SXybBCBtNkXyOPK2S/UjBqHWCInlDo3wr0iHTUH8ekjxBG8e2gs/idnAKKNEJdgg11rU3nMBNrzA+4W2NpyoW8VbsuaouLuMJrB/B6nVYQEfiphOsqbSmWgbvwBp1FYRkzf0e5re45PCk6Ht6k59BStPSqELGAunITPIyfcFigOgrbLMjG1rckvD1CMEcW9XiHVlHgewl3cB9L7XyDXni6s3VkgoZCjrVjb0M1Eg94yCMJ70p4jyb/yx3hKTZy38ikHxQX5H2v1tptXiPa1ERZm1JPWZsrVbUqfqfVekWbxBVuqg59eqgPcvgQr85HJnzszXuX6Pw/23y0UaPYCD7J4VMTbnnY1p+ZcNurtb2rXVhsEZbDkgmf5/DFEO7Q80W3FzPH8vxIprB8cvShQekbLsebw2QsUn39QVHAe4VahybOvZX2EFbrisqPkkwqvhTW2nQKVXEmj3V329kB3LOxbusmbOTQMuHLHO4PoV33lr2C7AGSbda9tbansF/ZoxCZ6KeYBgsniOltZBIemrD1r+i/VhTbSNE5pqA2LRc4dH3kbWV2F74Zwrdd+O6ZU77jR268i+nUoYuc3w/BqtMZ5JF+iHlmYWI5cWj7ERfwQ0uj06pNnSzMAlb0ejGOODC0qCD91HJ5j2WBBPtQ3QIp/H6fC4zKOS2QMURfGXlujUVwMSCuemNXRYkcvdM4Rgj9bhDbLBgdCsvWRwYvsyX4jb8AaI0ltw==",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-06-07T12:27:30.543572Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-06-07_05_27_29-14725952395645672865'
 location: 'us-central1'
 name: 'beamapp-jenkins-0607114912-803158'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-06-07T12:27:30.543572Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-06-07_05_27_29-14725952395645672865]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_27_29-14725952395645672865?project=apache-beam-testing
root: INFO: Job 2019-06-07_05_27_29-14725952395645672865 is in state JOB_STATE_RUNNING
root: INFO: 2019-06-07T12:27:29.078Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-06-07_05_27_29-14725952395645672865. The number of workers will be between 1 and 1000.
root: INFO: 2019-06-07T12:27:29.157Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-06-07_05_27_29-14725952395645672865.
root: INFO: 2019-06-07T12:27:32.404Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-06-07T12:27:33.136Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
root: INFO: 2019-06-07T12:27:33.685Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-06-07T12:27:33.728Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-07T12:27:33.782Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from datastore/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-06-07T12:27:33.804Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-06-07T12:27:33.843Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-06-07T12:27:33.971Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-06-07T12:27:34.012Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-06-07T12:27:34.042Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read from datastore/UserQuery/Read
root: INFO: 2019-06-07T12:27:34.080Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into read from datastore/Reshuffle/AddRandomKeys
root: INFO: 2019-06-07T12:27:34.119Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-06-07T12:27:34.159Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
root: INFO: 2019-06-07T12:27:34.196Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
root: INFO: 2019-06-07T12:27:34.244Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-06-07T12:27:34.271Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
root: INFO: 2019-06-07T12:27:34.305Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
root: INFO: 2019-06-07T12:27:34.335Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/AddRandomKeys into read from datastore/SplitQuery
root: INFO: 2019-06-07T12:27:34.371Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-06-07T12:27:34.399Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
root: INFO: 2019-06-07T12:27:34.432Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from datastore/Read
root: INFO: 2019-06-07T12:27:34.475Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify into read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-06-07T12:27:34.505Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from datastore/Reshuffle/RemoveRandomKeys
root: INFO: 2019-06-07T12:27:34.540Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/RemoveRandomKeys into read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
root: INFO: 2019-06-07T12:27:34.572Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/UnKey into Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
root: INFO: 2019-06-07T12:27:34.612Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2019-06-07T12:27:34.650Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-06-07T12:27:34.681Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-06-07T12:27:34.718Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2019-06-07T12:27:34.756Z: JOB_MESSAGE_DETAILED: Unzipping flatten s21 for input s19.out
root: INFO: 2019-06-07T12:27:34.800Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-06-07T12:27:34.835Z: JOB_MESSAGE_DETAILED: Unzipping flatten s21-u40 for input s22-reify-value18-c38
root: INFO: 2019-06-07T12:27:34.873Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2019-06-07T12:27:34.913Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-06-07T12:27:34.956Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-06-07T12:27:34.991Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-06-07T12:27:35.019Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-06-07T12:27:35.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
root: INFO: 2019-06-07T12:27:35.090Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-06-07T12:27:35.116Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
root: INFO: 2019-06-07T12:27:35.156Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-06-07T12:27:35.193Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-06-07T12:27:35.215Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-06-07T12:27:35.252Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-06-07T12:27:35.393Z: JOB_MESSAGE_DEBUG: Executing wait step start51
root: INFO: 2019-06-07T12:27:35.489Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
root: INFO: 2019-06-07T12:27:35.526Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2019-06-07T12:27:35.538Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-06-07T12:27:35.567Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-06-07T12:27:35.573Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
root: INFO: 2019-06-07T12:27:35.642Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-06-07T12:27:35.680Z: JOB_MESSAGE_DEBUG: Value "read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-06-07T12:27:35.718Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-06-07T12:27:35.747Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-06-07T12:27:35.775Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-06-07T12:28:44.247Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-06-07T12:29:38.613Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-07T12:29:38.654Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-06-07T12:32:01.877Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-06-07T12:32:01.986Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from datastore/Reshuffle/RemoveRandomKeys+read from datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
root: INFO: 2019-06-07T12:32:12.010Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
root: INFO: 2019-06-07T12:32:12.120Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
root: INFO: 2019-06-07T12:32:20.811Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/UnKey.out" materialized.
root: INFO: 2019-06-07T12:32:20.888Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
root: INFO: 2019-06-07T12:32:20.993Z: JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0).output" materialized.
root: INFO: 2019-06-07T12:32:21.083Z: JOB_MESSAGE_BASIC: Executing operation Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-06-07T12:32:26.632Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
root: INFO: 2019-06-07T12:32:26.728Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
root: INFO: 2019-06-07T12:32:35.980Z: JOB_MESSAGE_DEBUG: Executing success step success49
root: INFO: 2019-06-07T12:32:36.109Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-06-07T12:32:36.180Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-06-07T12:32:36.216Z: JOB_MESSAGE_BASIC: Stopping worker pool...
--------------------- >> end captured logging << ---------------------
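
The fused steps in the captured log above (read from datastore/Read, Globally/CombineGlobally(CountCombineFn), assert_that/...) describe a pipeline that reads entities, counts them globally, and asserts on the count. A minimal sketch of that shape, with an in-memory Create standing in for the Datastore read:

    import apache_beam as beam
    from apache_beam.testing.util import assert_that, equal_to

    with beam.Pipeline() as p:
        count = (
            p
            | 'Read' >> beam.Create(['entity-1', 'entity-2', 'entity-3'])  # stands in for the Datastore read
            | 'CountGlobally' >> beam.combiners.Count.Globally())
        assert_that(count, equal_to([3]))
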
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_41-4318871746770569537?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_56_41-16638429099466479781?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
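
The BeamDeprecationWarning above is about reading options back off the pipeline object via <pipeline>.options. A minimal sketch of the replacement pattern, keeping a reference to the PipelineOptions and passing it to the Pipeline constructor (the temp_location bucket is a placeholder, not one used by this build):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Keep the options object around instead of reading <pipeline>.options later.
    options = PipelineOptions(['--temp_location=gs://example-bucket/tmp'])  # placeholder bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | 'Create' >> beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)
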
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_05_13-8586131631978816794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_13_40-17558325224099196139?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_21_21-10603722675126097143?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_41-4894125892017301095?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
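
The deprecation warning above points at BigQuerySink and suggests WriteToBigQuery. A minimal sketch of that transform; the project, dataset, table and schema below are placeholders, not values from this job:

    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (
            p
            | 'MakeRows' >> beam.Create([{'word': 'beam', 'count': 1}])
            | 'WriteToBQ' >> beam.io.WriteToBigQuery(
                'example-project:example_dataset.example_table',  # placeholder table spec
                schema='word:STRING,count:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
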
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_03_57-10080758728269393862?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
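
The FutureWarnings above come from fileio_test.py exercising the still-experimental MatchAll/ReadMatches transforms. A minimal sketch of the same pattern (match a file pattern via MatchFiles, which wraps MatchAll, open the matches, read their contents); the gs:// pattern is a placeholder:

    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        contents = (
            p
            | 'MatchFiles' >> fileio.MatchFiles('gs://example-bucket/path/*.txt')  # placeholder pattern
            | 'ReadMatches' >> fileio.ReadMatches()
            | 'ReadContents' >> beam.Map(lambda readable_file: readable_file.read_utf8()))
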
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_13_58-8064331153011830608?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_51-4185952938210716518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_54_41-14983190191632702758?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_04_45-7201804459423737774?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_59-7626963168623374437?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_02_00-7585431912756645777?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_10_38-4866370777412968384?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_20_03-7050588243463195028?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_37-17073241479750870628?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_52_45-11017005079432836654?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_03_53-18392084704163804831?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_11_36-16913678338791385289?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_37-8637649931738325024?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_49_05-15923605174621264969?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_58_08-1678104390732890214?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_07_09-5413314292379573532?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_39-2562582553352328919?project=apache-beam-testing.
Exception in thread Thread-84:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_49_37-10678711611686835179?project=apache-beam-testing.
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_58_58-3584147765440104096?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_08_10-8185338245263387190?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py",> line 197, in wrapper
    return fun(*args, **kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_17_56-2116510855614329434?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 668, in get_job
    response = self._client.projects_locations_jobs.Get(request)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_27_29-14725952395645672865?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-06-07_05_27_29-14725952395645672865?alt=json>: response: <{'date': 'Fri, 07 Jun 2019 12:33:25 GMT', 'server': 'ESF', 'transfer-encoding': 'chunked', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'cache-control': 'private', 'status': '404', 'x-xss-protection': '0', '-content-encoding': 'gzip', 'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'content-length': '280'}>, content <{
  "error": {
    "code": 404,
    "message": "(9443ab17794dff0e): Information about job 2019-06-07_05_27_29-14725952395645672865 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
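
The single test failure in this run is the 404 above: the poll_for_job_completion thread in dataflow_runner.py asked the Dataflow API for job 2019-06-07_05_27_29-14725952395645672865 and got NOT_FOUND, which is not treated as retryable, so the polling thread died. A minimal sketch, not the Beam implementation, of how such a call could be wrapped with Beam's retry decorator so that a transient NOT_FOUND right after job creation is retried:

    from apitools.base.py import exceptions as apitools_exceptions

    from apache_beam.utils import retry


    def retry_on_transient_not_found(exn):
        # Hypothetical filter: retry the usual server errors and also NOT_FOUND,
        # which can be returned briefly right after a job is created.
        return (retry.retry_on_server_errors_filter(exn)
                or isinstance(exn, apitools_exceptions.HttpNotFoundError))


    @retry.with_exponential_backoff(num_retries=5, retry_filter=retry_on_transient_not_found)
    def get_job_with_retry(dataflow_client, job_id):
        # dataflow_client is expected to expose get_job(), like the runner's API client above.
        return dataflow_client.get_job(job_id)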

<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_40_41-3072538181385807257?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_50_58-11163484600370946025?project=apache-beam-testing.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_04_59_01-11624641319444460127?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_07_53-6079461187765197300?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_18_28-8118384005411018175?project=apache-beam-testing.

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3400.727s

FAILED (SKIP=5, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 57m 44s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/qqfm6x6ceved4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python3_Verify #1071

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1071/display/redirect?page=changes>


---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Build failed in Jenkins: beam_PostCommit_Python3_Verify #1070

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1070/display/redirect>

------------------------------------------
[...truncated 578.96 KB...]
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_11-16429361772171558269?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_47_52-13086614286806418895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_56_33-6756701139436005306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_07_44-6113642012995322433?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_17_45-4126743842816010547?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_26_52-16035342005431304103?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_13-3828482321483591406?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_48_40-15242779991439051895?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_56_47-12333938442821757272?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_06_09-16737443504174405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_14_27-16426570711246905074?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_13-7671207373608637871?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_50_28-6710049227730375249?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_00_45-2922503754531823964?project=apache-beam-testing.
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_09_19-12076241445348957101?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_17_50-11186452286042144201?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3445.311s

OK (SKIP=5)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_40-4122523599343017471?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_53_55-10624435881705264223?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_02_02-7133324961204386990?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_11_40-6881369196205963930?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_30-10879443668644620547?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_02_07-14766894789355217206?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_12_14-1444315573981859398?project=apache-beam-testing.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_31-17926065714439956428?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_52_46-5411613666490338953?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_02_13-6558839716255977510?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_11_29-17779012287710035712?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_20_10-1820654819173720180?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_32-10942273540766565111?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_59_39-17435150221243678881?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_08_26-16575418830859550432?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_17_25-9167803141836530874?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_29-9796022516443134260?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_47_41-15511356990977356339?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_57_08-4882159782122516034?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_08_14-4191971450710567242?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_29-14689979844299474697?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_48_43-1773346860011495732?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_59_27-5974080410878175591?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_09_18-12383324765632520118?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_32-14571865076973878932?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_48_07-11105067855280790045?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_57_00-17420732748930327186?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_06_27-7324483872614856424?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_14_45-5730585568513216715?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_38_31-8582796622799335352?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_47_58-7949872783077212772?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_05_57_04-7770052171750036106?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_08_02-11403890569376165869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_16_43-14531797403861302094?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-06-07_06_26_37-16180876408260304208?project=apache-beam-testing.
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_multiple_destinations_transform_streaming (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... SKIP: TestStream is not supported on TestDataflowRunner
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3575.104s

OK (SKIP=5)

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/direct/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 35s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/2gj53wzetxtla

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org