Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2020/01/09 00:04:45 UTC

Build failed in Jenkins: beam_PostCommit_Python35 #1407

See <https://builds.apache.org/job/beam_PostCommit_Python35/1407/display/redirect?page=changes>

Changes:

[github] Add # pytype: skip-file before first import statement in each py file


------------------------------------------
Started by GitHub push by boyuanzz
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-13 (beam) in workspace <https://builds.apache.org/job/beam_PostCommit_Python35/ws/>
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 7547ac6b273e6e2ffe7d69775606e14c0fd455b2 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 7547ac6b273e6e2ffe7d69775606e14c0fd455b2
Commit message: "Add # pytype: skip-file before first import statement in each py file (#10533)"
 > git rev-list --no-walk 4f846531ca7a98353e2af80d967181fe5fa63e60 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[Gradle] - Launching build.
[src] $ <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/gradlew> --continue --max-workers=12 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx4g :python35PostCommit
Starting a Gradle Daemon, 1 busy Daemon could not be reused, use --status for details
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy
> Task :buildSrc:spotlessGroovyCheck
> Task :buildSrc:spotlessGroovyGradle
> Task :buildSrc:spotlessGroovyGradleCheck
> Task :buildSrc:spotlessCheck
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties FROM-CACHE
> Task :buildSrc:check
> Task :buildSrc:build
Configuration on demand is an incubating feature.

> Configure project :sdks:java:container
Found go 1.12 in /usr/bin/go, use it.

> Configure project :sdks:python:container
Found go 1.12 in /usr/bin/go, use it.

FAILURE: Build failed with an exception.

* What went wrong:
Could not determine the dependencies of task ':runners:spark:job-server:shadowJar'.
> Could not resolve all dependencies for configuration ':runners:spark:job-server:runtimeClasspath'.
   > Could not resolve net.minidev:json-smart:[1.3.1,2.3].
     Required by:
         project :runners:spark:job-server > project :runners:spark > org.apache.hadoop:hadoop-common:2.8.5 > org.apache.hadoop:hadoop-auth:2.8.5 > com.nimbusds:nimbus-jose-jwt:4.41.1
      > Could not resolve net.minidev:json-smart:2.3-SNAPSHOT.
         > Unable to load Maven meta-data from https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml.
            > Could not get resource 'https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml'.
               > Could not GET 'https://oss.sonatype.org/content/repositories/staging/net/minidev/json-smart/2.3-SNAPSHOT/maven-metadata.xml'.
                  > Read timed out

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 50s

Publishing build scan...
https://gradle.com/s/gbitfaomc3whg

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


Jenkins build is back to normal : beam_PostCommit_Python35 #1409

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/1409/display/redirect>



Build failed in Jenkins: beam_PostCommit_Python35 #1408

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python35/1408/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-9030] Bump grpc to 1.26.0

[sunjincheng121] [BEAM-9030] Update the dependencies to make sure the dependency linkage

[sunjincheng121] fixup

[sunjincheng121] fixup


------------------------------------------
[...truncated 2.71 MB...]
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    },
                    {
                      "@type": "FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
                      "component_encodings": [],
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                    }
                  ],
                  "is_pair_like": true,
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_6"
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "m_out.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": "ref_AppliedPTransform_m_out_17",
        "user_name": "m_out"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
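
The "@type" entries in the pipeline graph above name a coder class followed by its serialized form; the component structure describes a windowed key/value pair encoded with FastPrimitivesCoder. A minimal sketch of round-tripping an element through that coder, assuming the Beam Python SDK is installed (the sample tuple is illustrative):

    from apache_beam.coders import FastPrimitivesCoder

    coder = FastPrimitivesCoder()
    # Encode a key/value pair the way the component coders above would
    encoded = coder.encode(('key', 42))
    assert coder.decode(encoded) == ('key', 42)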
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: '2020-01-09T01:49:43.564405Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-01-08_17_49_42-1352193968755426111'
 location: 'us-central1'
 name: 'beamapp-jenkins-0109014915-707756'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-01-09T01:49:43.564405Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-01-08_17_49_42-1352193968755426111]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_49_42-1352193968755426111?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-08_17_49_42-1352193968755426111 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:42.235Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-01-08_17_49_42-1352193968755426111. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:42.235Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-01-08_17_49_42-1352193968755426111.
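A note on the autoscaling message above: the 1-to-1000 range is what Dataflow uses when no worker cap is configured. A minimal sketch of bounding the pool from the Python SDK, assuming the standard Beam worker options (the cap value is illustrative; the project name is copied from the log):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        runner='DataflowRunner',
        project='apache-beam-testing',  # project from the log above
        max_num_workers=4,              # illustrative cap; left unset, the range stays 1-1000
    )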
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:45.789Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:46.748Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.222Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.245Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.293Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.320Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.340Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.360Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.387Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.424Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.441Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2595>) into Create/Impulse
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.462Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2595>)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.489Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.514Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.541Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.596Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.622Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.648Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.673Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Map(decode)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.699Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.724Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.748Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.772Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.797Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.825Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.841Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.865Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:47.890Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.055Z: JOB_MESSAGE_DEBUG: Executing wait step start23
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.110Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.136Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.146Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.171Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.207Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.220Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.265Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.287Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:49:48.345Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:2595>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:50:00.680Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
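The metric-descriptor warning above is driven by user-defined metrics: each unique metric name in a pipeline produces one Stackdriver descriptor, and this project has already hit the 100-descriptor budget. A minimal sketch of the kind of counter that creates such a descriptor (the DoFn and metric name are hypothetical):

    import apache_beam as beam
    from apache_beam.metrics import Metrics

    class CountRows(beam.DoFn):
        def __init__(self):
            # One descriptor per unique metric name, independent of the defining DoFn
            self.rows = Metrics.counter(self.__class__, 'rows_seen')

        def process(self, element):
            self.rows.inc()
            yield element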
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:50:13.500Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:50:50.947Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:50:50.979Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:54:59.521Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:2595>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:54:59.578Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:54:59.620Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:54:59.678Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:21.490Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:21.560Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:21.605Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:21.684Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:26.973Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:27.035Z: JOB_MESSAGE_DEBUG: Executing success step success21
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:27.146Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:27.189Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:55:27.208Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:56:59.288Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:56:59.333Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-09T01:56:59.373Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-08_17_49_42-1352193968755426111 is in state JOB_STATE_DONE
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_16-17678984055443835503?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
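The BeamDeprecationWarning repeated throughout these logs flags code that reads <pipeline>.options after construction. A minimal sketch of the direction the warning points, keeping a reference to the PipelineOptions used to build the pipeline instead of reaching back through it (the experiment name is illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import DebugOptions, PipelineOptions

    options = PipelineOptions(['--experiments=some_experiment'])  # illustrative flag

    # Deprecated pattern flagged above: p.options.view_as(DebugOptions).experiments
    # Preferred: query the options object that built the pipeline
    experiments = options.view_as(DebugOptions).experiments or []

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)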
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_30_46-4389657948384805513?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:259: FutureWarning: _ReadFromBigQuery is experimental.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_38_57-17934303559232032031?project=apache-beam-testing
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_46_54-1039511866676371970?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_09-10791506899004806587?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_36_00-6007761488248413404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_44_52-7541574665764043949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_52_37-15419265605248371453?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
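The BigQuerySink deprecation above names WriteToBigQuery as the replacement. A minimal sketch of the suggested transform, assuming the current Beam Python API (table spec, schema, and rows are hypothetical; kms_key mirrors the argument the deprecated call sites pass):

    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'name': 'a', 'count': 1}])  # hypothetical rows
         | beam.io.WriteToBigQuery(
             'my-project:my_dataset.my_table',       # hypothetical table spec
             schema='name:STRING,count:INTEGER',
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
             kms_key=None))                          # set a CMEK key name if required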
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:757: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_13-16066899679051762750?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_28_49-14424799375319240203?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_37_03-4339803770332181857?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_46_22-5597018002740881691?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_54_54-18205738857856099217?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:298: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_35-15836011728274264318?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1418: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_38_05-13390347440529599350?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_55_25-14305289197533217198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_11-4867696881260417205?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_25_52-8193089577057669710?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_35_06-14928402115916290700?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_44_04-1479059389296733060?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_51_51-3214138726490565390?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_59_54-3632297290977965474?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_11-3113000607845454052?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_24_24-567569948864018995?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_33_22-96769348997578408?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_41_42-13082402761475672504?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_49_42-1352193968755426111?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_57_36-5337105904068563483?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_12-18174587188170681804?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_25_33-2424085795876664993?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_33_47-15950254007130499877?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_41_28-9310583867118077994?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_49_19-10279896718376065611?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:75: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_16_11-971710014320769864?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_25_10-10955827617014971854?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_32_56-17995653878713418001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_41_11-8694769328186034821?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_50_00-5378411136865092997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-08_17_57_55-3490852206222736465?project=apache-beam-testing

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 50 tests in 3137.972s

FAILED (SKIP=9, failures=1)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 53m 34s
84 actionable tasks: 65 executed, 19 from cache

Publishing build scan...
https://gradle.com/s/vqlf5btzxqixa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org