Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/12/10 14:05:29 UTC

Build failed in Jenkins: beam_PostCommit_Python_Verify #6788

See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6788/display/redirect?page=changes>

Changes:

[gleb] [BEAM-5866] Override structuralValue in ListCoder

[gleb] [BEAM-5866] Override structuralValue in MapCoder

[github] Clarify usage of PipelineOptions subclass

[coheigea] Move string literals to the left hand side of the expression in a few

[coheigea] Upgrade to Apache Tika 1.19.1

[robertwb] [BEAM-4444] Parquet IO for Python SDK (#6763)

------------------------------------------
[...truncated 150.11 KB...]
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../sdks/python ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/sdks/
+ cp -r ./apache_beam/io/hdfs_integration_test/../../../../../model ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration/
++ echo hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788
+ PROJECT_NAME=hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788
+ '[' -z jenkins-beam_PostCommit_Python_Verify-6788 ']'
+ COLOR_OPT=--no-ansi
+ COMPOSE_OPT='-p hdfs_IT-jenkins-beam_PostCommit_Python_Verify-6788 --no-ansi'
+ cd ./apache_beam/io/hdfs_integration_test/../../../../../build/hdfs_integration
+ docker network prune --force
runtime/cgo: pthread_create failed: Resource temporarily unavailable
SIGABRT: abort
PC=0x7f65f80a3428 m=0

goroutine 0 [idle]:

goroutine 1 [running, locked to thread]:
runtime.systemstack_switch()
	/usr/local/go/src/runtime/asm_amd64.s:252 fp=0xc42005fe48 sp=0xc42005fe40
runtime.newproc(0x0, 0xee95c8)
	/usr/local/go/src/runtime/proc.go:2713 +0x8b fp=0xc42005fe90 sp=0xc42005fe48
runtime.init.3()
	/usr/local/go/src/runtime/proc.go:213 +0x35 fp=0xc42005feb0 sp=0xc42005fe90
runtime.init()
	/usr/local/go/src/runtime/write_err.go:14 +0x2ee fp=0xc42005ff48 sp=0xc42005feb0
runtime.main()
	/usr/local/go/src/runtime/proc.go:141 +0xf6 fp=0xc42005ffa0 sp=0xc42005ff48
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1 fp=0xc42005ffa8 sp=0xc42005ffa0

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

rax    0x0
rbx    0x7f65f8433700
rcx    0x7f65f80a3428
rdx    0x6
rdi    0x668d
rsi    0x668d
rbp    0xf1d11e
rsp    0x7ffc1f294008
r8     0x7f65f8434770
r9     0x7f65f8a6d700
r10    0x8
r11    0x206
r12    0x33081a0
r13    0xf3
r14    0x30
r15    0x3
rip    0x7f65f80a3428
rflags 0x206
cs     0x33
fs     0x0
gs     0x0
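
The "pthread_create failed: Resource temporarily unavailable" abort from the Go runtime inside the docker CLI, like the "fork: retry" errors further down, points to the build agent running out of per-user processes/threads rather than to a Beam code change. A hedged diagnostic sketch for a similar agent follows; the "jenkins" user name is an assumption, not taken from this log:

    # Check per-user process/thread limits on the agent (user name is assumed).
    ulimit -u                                  # max user processes for the current shell
    ps -eLf | awk '$1 == "jenkins"' | wc -l    # rough thread count owned by that user
    cat /proc/sys/kernel/threads-max           # system-wide thread ceiling
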
:beam-sdks-python:hdfsIntegrationTest (Thread[Task worker for ':',5,main]) completed. Took 0.067 secs.
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) started.

> Task :beam-sdks-python:postCommitIT
Caching disabled for task ':beam-sdks-python:postCommitIT': Caching has not been enabled for the task
Task ':beam-sdks-python:postCommitIT' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'sh''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python> Command: sh -c . <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/bin/activate> && ./scripts/run_integration_test.sh --test_opts "--nocapture --processes=8 --process-timeout=4500 --attr=IT"
Successfully started process 'command 'sh''


###########################################################################
# Build pipeline options if not provided in --pipeline_opts from commandline

if [[ -z $PIPELINE_OPTS ]]; then

  # Check that the script is running in a known directory.
  if [[ $PWD != *sdks/python* ]]; then
    echo 'Unable to locate Apache Beam Python SDK root directory'
    exit 1
  fi

  # Go to the Apache Beam Python SDK root
  if [[ "*sdks/python" != $PWD ]]; then
    cd $(pwd | sed 's/sdks\/python.*/sdks\/python/')
  fi

  # Create a tarball if not exists
  if [[ $(find ${SDK_LOCATION}) ]]; then
    SDK_LOCATION=$(find ${SDK_LOCATION})
  else
    python setup.py -q sdist
    SDK_LOCATION=$(find dist/apache-beam-*.tar.gz)
  fi

  # Install test dependencies for ValidatesRunner tests.
  echo "pyhamcrest" > postcommit_requirements.txt
  echo "mock" >> postcommit_requirements.txt

  # Options used to run testing pipeline on Cloud Dataflow Service. Also used for
  # running on DirectRunner (some options ignored).
  opts=(
    "--runner=$RUNNER"
    "--project=$PROJECT"
    "--staging_location=$GCS_LOCATION/staging-it"
    "--temp_location=$GCS_LOCATION/temp-it"
    "--output=$GCS_LOCATION/py-it-cloud/output"
    "--sdk_location=$SDK_LOCATION"
    "--requirements_file=postcommit_requirements.txt"
    "--num_workers=$NUM_WORKERS"
    "--sleep_secs=$SLEEP_SECS"
  )

  # Add --streaming if provided
  if [[ "$STREAMING" = true ]]; then
    opts+=("--streaming")
  fi

  # Add --dataflow_worker_jar if provided
  if [[ ! -z "$WORKER_JAR" ]]; then
    opts+=("--dataflow_worker_jar=$WORKER_JAR")
  fi

  PIPELINE_OPTS=$(IFS=" " ; echo "${opts[*]}")

fi
pwd | sed 's/sdks\/python.*/sdks\/python/'
./scripts/run_integration_test.sh: fork: retry: Resource temporarily unavailable
./scripts/run_integration_test.sh: fork: retry: No child processes
find ${SDK_LOCATION}
find ${SDK_LOCATION}
IFS=" " ; echo "${opts[*]}"


###########################################################################
# Run tests and validate that jobs finish successfully.

echo ">>> RUNNING integration tests with pipeline options: $PIPELINE_OPTS"
python setup.py nosetests \
  --test-pipeline-options="$PIPELINE_OPTS" \
  $TEST_OPTS
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20
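
The option string echoed above is handed to the test suite through nose's --test-pipeline-options flag. In Beam's Python SDK the integration tests typically read it via apache_beam.testing.test_pipeline.TestPipeline; a hedged, illustrative sketch (not a test from this build):

    # Illustrative IT skeleton; the class name and assertion are assumptions.
    import unittest
    from nose.plugins.attrib import attr
    from apache_beam.testing.test_pipeline import TestPipeline

    class ExampleIT(unittest.TestCase):

      @attr('IT')  # matched by the --attr=IT option passed to run_integration_test.sh
      def test_runs_with_configured_options(self):
        # TestPipeline parses whatever was supplied via --test-pipeline-options.
        pipeline = TestPipeline(is_integration_test=True)
        self.assertIsNotNone(pipeline.get_option('runner'))
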
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/dist.py>:470: UserWarning: Normalizing '2.10.0.dev' to '2.10.0.dev0'
  normalized_version,
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
Process SyncManager-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 558, in _run_server
    server.serve_forever()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 184, in serve_forever
    t.start()
  File "/usr/lib/python2.7/threading.py", line 736, in start
    _start_new_thread(self.__bootstrap, ())
error: can't start new thread
Traceback (most recent call last):
  File "setup.py", line 239, in <module>
    'test': generate_protos_first(test),
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/setuptools/__init__.py",> line 143, in setup
    return distutils.core.setup(**attrs)
  File "/usr/lib/python2.7/distutils/core.py", line 151, in setup
    dist.run_commands()
  File "/usr/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/usr/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/commands.py",> line 158, in run
    TestProgram(argv=argv, config=self.__config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 121, in __init__
    **extra_args)
  File "/usr/lib/python2.7/unittest/main.py", line 95, in __init__
    self.runTests()
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/core.py",> line 207, in runTests
    result = self.testRunner.run(self.test)
  File "<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/build/gradleenv/1327086738/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py",> line 365, in run
    testQueue = Queue()
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 667, in temp
    token, exp = self._create(typeid, *args, **kwds)
  File "/usr/lib/python2.7/multiprocessing/managers.py", line 565, in _create
    conn = self._Client(self._address, authkey=self._authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 175, in Client
    answer_challenge(c, authkey)
  File "/usr/lib/python2.7/multiprocessing/connection.py", line 432, in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
EOFError
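
The EOFError here is a downstream symptom of the same resource exhaustion: the SyncManager process above died with "error: can't start new thread", so the connection handshake in nose's multiprocess plugin reads from a dead server. A minimal sketch of that failure mode (an assumption for illustration, not code from this build):

    # nose's multiprocess plugin asks a multiprocessing Manager for a shared Queue;
    # if the manager's server process has already died, the client sees EOFError.
    import multiprocessing

    manager = multiprocessing.Manager()  # forks a SyncManager server process
    queue = manager.Queue()              # raises EOFError if that server is gone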

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 3.512 secs.

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 159

* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 313

* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 45s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/wkkfpyf22gtru
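
To get past the generic "finished with non-zero exit value" messages, any of the three failing tasks can be re-run with the flags Gradle suggests. A hedged sketch, run from the repository root (task path taken from the log above; --stacktrace and --info are standard Gradle options):

    ./gradlew :beam-sdks-python:postCommitIT --stacktrace --info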

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org


Jenkins build is back to normal : beam_PostCommit_Python_Verify #6790

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6790/display/redirect>




Build failed in Jenkins: beam_PostCommit_Python_Verify #6789

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/6789/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-6079] Add ability for CassandraIO to delete data

[iemejia] [BEAM-6079] Fix access level and clean up generics issues

------------------------------------------
[...truncated 1.14 MB...]
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "ReadFromPubSub/Map(_from_proto_str).out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s1"
        }, 
        "serialized_fn": "ref_AppliedPTransform_ReadFromPubSub/Map(_from_proto_str)_4", 
        "user_name": "ReadFromPubSub/Map(_from_proto_str)"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s3", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "add_attribute"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", 
                      "component_encodings": [], 
                      "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                    }
                  ], 
                  "is_pair_like": true, 
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_3"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "add_attribute.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "ref_AppliedPTransform_add_attribute_5", 
        "user_name": "add_attribute"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s4", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "to_proto_str"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:bytes"
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "WriteToPubSub/ToProtobuf.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s3"
        }, 
        "serialized_fn": "ref_AppliedPTransform_WriteToPubSub/ToProtobuf_7", 
        "user_name": "WriteToPubSub/ToProtobuf"
      }
    }, 
    {
      "kind": "ParallelWrite", 
      "name": "s5", 
      "properties": {
        "display_data": [], 
        "encoding": {
          "@type": "kind:windowed_value", 
          "component_encodings": [
            {
              "@type": "kind:bytes"
            }, 
            {
              "@type": "kind:global_window"
            }
          ], 
          "is_wrapper": true
        }, 
        "format": "pubsub", 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s4"
        }, 
        "pubsub_id_label": "id", 
        "pubsub_serialized_attributes_fn": "", 
        "pubsub_timestamp_label": "timestamp", 
        "pubsub_topic": "projects/apache-beam-testing/topics/psit_topic_output349ef07d-0943-47a4-a416-9b29a8731644", 
        "user_name": "WriteToPubSub/Write/NativeWrite"
      }
    }
  ], 
  "type": "JOB_TYPE_STREAMING"
}
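
The step names in the job description above (ReadFromPubSub/Map(_from_proto_str), add_attribute, WriteToPubSub/ToProtobuf, WriteToPubSub/Write/NativeWrite) correspond to a small PubSub round-trip pipeline. A hedged reconstruction of that shape follows; the topic, subscription, and add_attribute body are placeholders, not the actual test code:

    # Placeholder reconstruction of the pipeline shape implied by the step names.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def add_attribute(message):
      message.attributes['processed'] = 'true'  # placeholder transformation
      return message

    options = PipelineOptions()
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
      _ = (p
           | beam.io.ReadFromPubSub(
               subscription='projects/<project>/subscriptions/<input-sub>',
               with_attributes=True)             # yields PubsubMessage objects
           | 'add_attribute' >> beam.Map(add_attribute)
           | beam.io.WriteToPubSub(
               'projects/<project>/topics/<output-topic>',
               with_attributes=True))
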
root: INFO: Create job: <Job
 createTime: u'2018-12-10T16:46:48.654574Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-12-10_08_46_45-4661422501458528405'
 location: u'us-central1'
 name: u'beamapp-jenkins-1210164636-429196'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2018-12-10T16:46:48.654574Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
root: INFO: Created job with id: [2018-12-10_08_46_45-4661422501458528405]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_45-4661422501458528405?project=apache-beam-testing
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 19 tests in 1887.088s

FAILED (errors=5, failures=6)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_31-485909928086186037?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-2746877262249700722?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_28-16554391804803232926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-10136866855711679295?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-16707170736943022074?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_59-14550076557050917467?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_45_15-16396278941296888174?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_18-13056604862306843484?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_45-4661422501458528405?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_26-15932434091359080518?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_04-3692776795664872478?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_45_11-1132200751986424363?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_16-5927959844519459605?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_28-18153654139549890044?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_23-3015796963868094813?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_43_45-7614230511101895736?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_05-483436329258915908?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_36_27-18165015813611854332?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_44_08-12554187595833307794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-10_08_46_19-6828310122637119890?project=apache-beam-testing.
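
Each "Found:" line above is a Dataflow job launched by the test run. A hedged sketch for inspecting one of them from a workstation with access to the project (job id copied from the log; assumes a reasonably current gcloud SDK):

    gcloud dataflow jobs describe 2018-12-10_08_46_45-4661422501458528405 \
        --region=us-central1 --project=apache-beam-testing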

> Task :beam-sdks-python:postCommitIT FAILED
:beam-sdks-python:postCommitIT (Thread[Task worker for ':',5,main]) completed. Took 31 mins 28.072 secs.

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>' line: 274

* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 35m 28s
6 actionable tasks: 6 executed

Publishing build scan...
https://gradle.com/s/7rv5yqqf2newe

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
