Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/04/16 06:04:21 UTC

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #56

See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/56/display/redirect>

------------------------------------------
[...truncated 1.06 KB...]
Commit message: "This closes #5028"
 > git rev-list --no-walk 9e3e9c4d0a0dc1574c8956c7f8379b37ba262cb2 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins877046262130145167.sh
+ gcloud container clusters get-credentials io-datastores --zone=us-central1-a --verbosity=debug
DEBUG: Running gcloud.container.clusters.get-credentials with Namespace(__calliope_internal_deepest_parser=ArgumentParser(prog='gcloud.container.clusters.get-credentials', usage=None, description='See https://cloud.google.com/container-engine/docs/kubectl for\nkubectl documentation.', version=None, formatter_class=<class 'argparse.HelpFormatter'>, conflict_handler='error', add_help=False), account=None, api_version=None, authority_selector=None, authorization_token_file=None, cmd_func=<bound method Command.Run of <googlecloudsdk.calliope.backend.Command object at 0x7f45eba8f510>>, command_path=['gcloud', 'container', 'clusters', 'get-credentials'], configuration=None, credential_file_override=None, document=None, format=None, h=None, help=None, http_timeout=None, log_http=None, name='io-datastores', project=None, quiet=None, trace_email=None, trace_log=None, trace_token=None, user_output_enabled=None, verbosity='debug', version=None, zone='us-central1-a').
WARNING: Accessing a Container Engine cluster requires the kubernetes commandline
client [kubectl]. To install, run
  $ gcloud components install kubectl

Fetching cluster endpoint and auth data.
DEBUG: Saved kubeconfig to /home/jenkins/.kube/config
kubeconfig entry generated for io-datastores.
INFO: Display format "default".
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins646752526505326975.sh
+ cp /home/jenkins/.kube/config <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088>
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins5473807303277619747.sh
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> create namespace mongodbioit-1523854877088
namespace "mongodbioit-1523854877088" created
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins8317633776261829245.sh
++ kubectl config current-context
+ kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> config set-context gke_apache-beam-testing_us-central1-a_io-datastores --namespace=mongodbioit-1523854877088
Context "gke_apache-beam-testing_us-central1-a_io-datastores" modified.
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins6144272670483496440.sh
+ rm -rf PerfKitBenchmarker
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins5883485065134233483.sh
+ rm -rf .env
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins7318326992821682375.sh
+ virtualenv .env --system-site-packages
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/.env/bin/python>
Installing setuptools, pip, wheel...done.
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins8797345790548082373.sh
+ .env/bin/pip install --upgrade setuptools pip
Requirement already up-to-date: setuptools in ./.env/lib/python2.7/site-packages (39.0.1)
Requirement already up-to-date: pip in ./.env/lib/python2.7/site-packages (10.0.0)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 1.3.5 which is incompatible.
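
The "not installed" / "incompatible" lines above come from pip's post-install dependency check: the agent's user-level site-packages carries grpcio 1.3.5 and lacks hdfs and pytz, while the apache-beam source tree under test declares grpcio<2,>=1.8. The missing and conflicting packages are installed into the job's virtualenv by the later "pip install -e 'src/sdks/python/[gcp,test]'" step, so these warnings are noise rather than the cause of the failure. A small sketch, assuming only the standard pkg_resources API (this check is not part of the Jenkins job), of how the same conflict can be surfaced programmatically:

    # Hypothetical check: ask pkg_resources whether the grpcio requirement that
    # apache-beam declares is satisfiable with what is currently importable.
    # With grpcio 1.3.5 on the path this raises VersionConflict, matching pip's
    # "grpcio<2,>=1.8, but you'll have grpcio 1.3.5" warning above.
    import pkg_resources

    try:
        pkg_resources.require("grpcio>=1.8,<2")
        print("grpcio requirement satisfied")
    except pkg_resources.VersionConflict as exc:
        print("conflict: %s" % exc)
    except pkg_resources.DistributionNotFound as exc:
        print("grpcio not installed: %s" % exc)
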
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins2097619316824930501.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git
Cloning into 'PerfKitBenchmarker'...
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins3548842757274470321.sh
+ .env/bin/pip install -r PerfKitBenchmarker/requirements.txt
Requirement already satisfied: absl-py in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 14)) (0.1.3)
Requirement already satisfied: jinja2>=2.7 in /usr/local/lib/python2.7/dist-packages (from -r PerfKitBenchmarker/requirements.txt (line 15)) (2.8)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 16)) (39.0.1)
Requirement already satisfied: colorlog[windows]==2.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 17)) (2.6.0)
Requirement already satisfied: blinker>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 18)) (1.4)
Requirement already satisfied: futures>=3.0.3 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 19)) (3.1.1)
Requirement already satisfied: PyYAML==3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 20)) (3.12)
Requirement already satisfied: pint>=0.7 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 21)) (0.7.2)
Collecting numpy==1.13.3 (from -r PerfKitBenchmarker/requirements.txt (line 22))
  Using cached numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
Requirement already satisfied: functools32 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 23)) (3.2.3.post2)
Requirement already satisfied: contextlib2>=0.5.1 in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 24)) (0.5.4)
Requirement already satisfied: pywinrm in /home/jenkins/.local/lib/python2.7/site-packages (from -r PerfKitBenchmarker/requirements.txt (line 25)) (0.2.2)
Requirement already satisfied: six in /home/jenkins/.local/lib/python2.7/site-packages (from absl-py->-r PerfKitBenchmarker/requirements.txt (line 14)) (1.11.0)
Requirement already satisfied: MarkupSafe in /usr/local/lib/python2.7/dist-packages (from jinja2>=2.7->-r PerfKitBenchmarker/requirements.txt (line 15)) (0.23)
Requirement already satisfied: colorama; extra == "windows" in /usr/lib/python2.7/dist-packages (from colorlog[windows]==2.6.0->-r PerfKitBenchmarker/requirements.txt (line 17)) (0.2.5)
Requirement already satisfied: xmltodict in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (0.11.0)
Requirement already satisfied: requests-ntlm>=0.3.0 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.1.0)
Requirement already satisfied: requests>=2.9.1 in /home/jenkins/.local/lib/python2.7/site-packages (from pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (2.18.1)
Requirement already satisfied: ntlm-auth>=1.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.0.6)
Requirement already satisfied: cryptography>=1.3 in /home/jenkins/.local/lib/python2.7/site-packages (from requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (2.1.2)
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (2.5)
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.21.1)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.9.1->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (2017.4.17)
Requirement already satisfied: cffi>=1.7; platform_python_implementation != "PyPy" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.11.2)
Requirement already satisfied: enum34; python_version < "3" in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.1.6)
Requirement already satisfied: asn1crypto>=0.21.0 in /home/jenkins/.local/lib/python2.7/site-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (0.23.0)
Requirement already satisfied: ipaddress; python_version < "3" in /usr/local/lib/python2.7/dist-packages (from cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (1.0.19)
Requirement already satisfied: pycparser in /home/jenkins/.local/lib/python2.7/site-packages (from cffi>=1.7; platform_python_implementation != "PyPy"->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r PerfKitBenchmarker/requirements.txt (line 25)) (2.18)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
apache-beam 2.5.0.dev0 requires hdfs<3.0.0,>=2.1.0, which is not installed.
apache-beam 2.5.0.dev0 requires pytz>=2018.3, which is not installed.
apache-beam 2.5.0.dev0 has requirement grpcio<2,>=1.8, but you'll have grpcio 1.3.5 which is incompatible.
Installing collected packages: numpy
  Found existing installation: numpy 1.11.3
    Not uninstalling numpy at /home/jenkins/.local/lib/python2.7/site-packages, outside environment <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/.env>
    Can't uninstall 'numpy'. No files were found to uninstall.
Successfully installed numpy-1.13.3
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins5294900468332871899.sh
+ .env/bin/pip install -e 'src/sdks/python/[gcp,test]'
Obtaining file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/python>
Requirement already satisfied: avro<2.0.0,>=1.8.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (1.8.2)
Requirement already satisfied: crcmod<2.0,>=1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (1.7)
Requirement already satisfied: dill==0.2.6 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.2.6)
Collecting grpcio<2,>=1.8 (from apache-beam==2.5.0.dev0)
  Using cached grpcio-1.11.0-cp27-cp27mu-manylinux1_x86_64.whl
Collecting hdfs<3.0.0,>=2.1.0 (from apache-beam==2.5.0.dev0)
Requirement already satisfied: httplib2<0.10,>=0.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.9.2)
Requirement already satisfied: mock<3.0.0,>=1.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (2.0.0)
Requirement already satisfied: oauth2client<5,>=2.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.0.0)
Requirement already satisfied: protobuf<4,>=3.5.0.post1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.5.0.post1)
Collecting pytz>=2018.3 (from apache-beam==2.5.0.dev0)
  Using cached pytz-2018.4-py2.py3-none-any.whl
Requirement already satisfied: pyyaml<4.0.0,>=3.12 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.12)
Requirement already satisfied: pyvcf<0.7.0,>=0.6.8 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.6.8)
Requirement already satisfied: six<1.12,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (1.11.0)
Requirement already satisfied: typing<3.7.0,>=3.6.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.6.2)
Requirement already satisfied: futures<4.0.0,>=3.1.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (3.1.1)
Requirement already satisfied: google-apitools<=0.5.20,>=0.5.18 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.5.20)
Requirement already satisfied: proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.90.4)
Requirement already satisfied: googledatastore==7.0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (7.0.1)
Requirement already satisfied: google-cloud-pubsub==0.26.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.26.0)
Requirement already satisfied: proto-google-cloud-pubsub-v1==0.15.4 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.15.4)
Requirement already satisfied: google-cloud-bigquery==0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (0.25.0)
Collecting nose>=1.3.7 (from apache-beam==2.5.0.dev0)
  Using cached nose-1.3.7-py2-none-any.whl
Requirement already satisfied: pyhamcrest<2.0,>=1.9 in /home/jenkins/.local/lib/python2.7/site-packages (from apache-beam==2.5.0.dev0) (1.9.0)
Requirement already satisfied: enum34>=1.0.4 in /home/jenkins/.local/lib/python2.7/site-packages (from grpcio<2,>=1.8->apache-beam==2.5.0.dev0) (1.1.6)
Requirement already satisfied: requests>=2.7.0 in /home/jenkins/.local/lib/python2.7/site-packages (from hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (2.18.1)
Requirement already satisfied: docopt in /usr/local/lib/python2.7/dist-packages (from hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (0.6.2)
Requirement already satisfied: funcsigs>=1; python_version < "3.3" in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.5.0.dev0) (1.0.2)
Requirement already satisfied: pbr>=0.11 in /home/jenkins/.local/lib/python2.7/site-packages (from mock<3.0.0,>=1.0.1->apache-beam==2.5.0.dev0) (3.1.0)
Requirement already satisfied: pyasn1>=0.1.7 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0) (0.2.3)
Requirement already satisfied: pyasn1-modules>=0.0.5 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0) (0.0.9)
Requirement already satisfied: rsa>=3.1.4 in /home/jenkins/.local/lib/python2.7/site-packages (from oauth2client<5,>=2.0.1->apache-beam==2.5.0.dev0) (3.4.2)
Requirement already satisfied: setuptools in ./.env/lib/python2.7/site-packages (from protobuf<4,>=3.5.0.post1->apache-beam==2.5.0.dev0) (39.0.1)
Requirement already satisfied: fasteners>=0.14 in /home/jenkins/.local/lib/python2.7/site-packages (from google-apitools<=0.5.20,>=0.5.18->apache-beam==2.5.0.dev0) (0.14.1)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.5.2 in /home/jenkins/.local/lib/python2.7/site-packages (from proto-google-cloud-datastore-v1<=0.90.4,>=0.90.0->apache-beam==2.5.0.dev0) (1.5.2)
Requirement already satisfied: google-cloud-core<0.26dev,>=0.25.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.25.0)
Requirement already satisfied: gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.15.4)
Requirement already satisfied: idna<2.6,>=2.5 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (2.5)
Requirement already satisfied: urllib3<1.22,>=1.21.1 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (1.21.1)
Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /home/jenkins/.local/lib/python2.7/site-packages (from requests>=2.7.0->hdfs<3.0.0,>=2.1.0->apache-beam==2.5.0.dev0) (2017.4.17)
Requirement already satisfied: monotonic>=0.1 in /home/jenkins/.local/lib/python2.7/site-packages (from fasteners>=0.14->google-apitools<=0.5.20,>=0.5.18->apache-beam==2.5.0.dev0) (1.4)
Requirement already satisfied: google-auth<2.0.0dev,>=0.4.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (1.0.1)
Requirement already satisfied: google-auth-httplib2 in /home/jenkins/.local/lib/python2.7/site-packages (from google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.0.2)
Requirement already satisfied: google-gax<0.16dev,>=0.15.7 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.15.13)
Requirement already satisfied: grpc-google-iam-v1<0.12dev,>=0.11.1 in /home/jenkins/.local/lib/python2.7/site-packages (from gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.11.1)
Requirement already satisfied: cachetools>=2.0.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-auth<2.0.0dev,>=0.4.0->google-cloud-core<0.26dev,>=0.25.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (2.0.0)
Requirement already satisfied: future<0.17dev,>=0.16.0 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (0.16.0)
Requirement already satisfied: ply==3.8 in /home/jenkins/.local/lib/python2.7/site-packages (from google-gax<0.16dev,>=0.15.7->gapic-google-cloud-pubsub-v1<0.16dev,>=0.15.0->google-cloud-pubsub==0.26.0->apache-beam==2.5.0.dev0) (3.8)
cheetah 2.4.4 requires Markdown>=2.0.1, which is not installed.
Installing collected packages: grpcio, hdfs, pytz, nose, apache-beam
  Found existing installation: grpcio 1.3.5
    Not uninstalling grpcio at /home/jenkins/.local/lib/python2.7/site-packages, outside environment <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/.env>
    Can't uninstall 'grpcio'. No files were found to uninstall.
  Found existing installation: apache-beam 2.5.0.dev0
    Not uninstalling apache-beam at /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_TextIOIT/src/sdks/python, outside environment <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/.env>
    Can't uninstall 'apache-beam'. No files were found to uninstall.
  Running setup.py develop for apache-beam
Successfully installed apache-beam grpcio-1.11.0 hdfs-2.1.0 nose-1.3.7 pytz-2018.4
[beam_PerformanceTests_MongoDBIO_IT] $ /bin/bash -xe /tmp/jenkins7062263743164478820.sh
+ .env/bin/python PerfKitBenchmarker/pkb.py --project=apache-beam-testing --dpb_log_level=INFO --maven_binary=/home/jenkins/tools/maven/latest/bin/mvn --bigquery_table=beam_performance.mongodbioit_pkb_results --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/> --official=true --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> --beam_it_timeout=1800 --benchmarks=beam_integration_benchmark --beam_it_profile=io-it --beam_prebuilt=true --beam_sdk=java --beam_it_module=sdks/java/io/mongodb --beam_it_class=org.apache.beam.sdk.io.mongodb.MongoDBIOIT '--beam_it_options=[--tempRoot=gs://temp-storage-for-perf-tests,--project=apache-beam-testing,--numberOfRecords=10000000]' --beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml> --beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/pkb-config.yml>
2018-04-16 06:00:48,875 87aaf736 MainThread INFO     Verbose logging to: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/87aaf736/pkb.log>
2018-04-16 06:00:48,881 87aaf736 MainThread INFO     PerfKitBenchmarker version: v1.12.0-489-gf0acf31
2018-04-16 06:00:48,882 87aaf736 MainThread INFO     Flag values:
--beam_it_class=org.apache.beam.sdk.io.mongodb.MongoDBIOIT
--beam_it_timeout=1800
--beam_sdk=java
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/>
--maven_binary=/home/jenkins/tools/maven/latest/bin/mvn
--beam_it_options=[--tempRoot=gs://temp-storage-for-perf-tests,--project=apache-beam-testing,--numberOfRecords=10000000]
--beam_prebuilt
--kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088>
--project=apache-beam-testing
--beam_kubernetes_scripts=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
--bigquery_table=beam_performance.mongodbioit_pkb_results
--official
--beam_it_module=sdks/java/io/mongodb
--dpb_log_level=INFO
--beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/pkb-config.yml>
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-04-16 06:00:49,149 87aaf736 MainThread WARNING  The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2018-04-16 06:00:49,150 87aaf736 MainThread INFO     Initializing the edw service decoder
2018-04-16 06:00:49,290 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Provisioning resources for benchmark beam_integration_benchmark
2018-04-16 06:00:49,293 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Preparing benchmark beam_integration_benchmark
2018-04-16 06:00:49,294 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: git clone https://github.com/apache/beam.git
2018-04-16 06:01:17,844 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> create -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-16 06:01:18,076 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running benchmark beam_integration_benchmark
2018-04-16 06:01:18,080 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:01:28,219 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:01:38,345 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:01:48,491 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:01:58,637 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:08,851 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:18,997 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:29,145 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:39,288 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:49,430 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:02:59,575 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:03:09,718 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:03:19,864 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:03:30,011 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:03:40,144 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:03:50,285 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:04:00,421 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:04:10,558 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:04:20,716 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-16 06:04:20,850 87aaf736 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-04-16 06:04:20,851 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-16 06:04:20,851 87aaf736 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523854877088> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-16 06:04:21,110 87aaf736 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 757, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 148, in Run
    dynamic_pipeline_options)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 100, in GenerateAllPipelineOptions
    EvaluateDynamicPipelineOptions(dynamic_pipeline_options))
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 67, in EvaluateDynamicPipelineOptions
    argValue = RetrieveLoadBalancerIp(optionDescriptor)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/beam_pipeline_options.py",> line 153, in RetrieveLoadBalancerIp
    raise "Could not retrieve LoadBalancer IP address"
TypeError: exceptions must be old-style classes or derived from BaseException, not str
2018-04-16 06:04:21,111 87aaf736 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-16 06:04:21,156 87aaf736 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-16 06:04:21,157 87aaf736 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/87aaf736/pkb.log>
2018-04-16 06:04:21,157 87aaf736 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/87aaf736/completion_statuses.json>
Build step 'Execute shell' marked build as failure
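
Build #56 never reaches the MongoDB test itself: after about twenty polls of kubectl get svc mongo-load-balancer-service -ojsonpath={.status.loadBalancer.ingress[0].ip} over roughly three minutes, no external IP has been assigned, and PerfKitBenchmarker's RetrieveLoadBalancerIp (beam_pipeline_options.py, line 153) gives up. The giving-up itself is broken, though: it raises a bare string, which Python 2.7 rejects with the TypeError in the traceback, so the intended "Could not retrieve LoadBalancer IP address" message is lost. A minimal, self-contained sketch of that behaviour and of an exception-based alternative (names other than the message are illustrative, based only on the traceback above, not on the actual PerfKitBenchmarker sources):

    # Sketch of the raise that fails above (Python 2.7 in this job; Python 3 raises a
    # comparable TypeError). Raising a bare string is rejected outright, which is why
    # the TypeError, not the missing-IP message, surfaces in the log.
    try:
        raise "Could not retrieve LoadBalancer IP address"
    except TypeError as exc:
        print(exc)  # exceptions must be old-style classes or derived from BaseException, not str

    # Raising an exception *instance* keeps the diagnostic intact instead.
    # LoadBalancerIpError is an illustrative name, not a class PerfKitBenchmarker defines.
    class LoadBalancerIpError(Exception):
        """Raised when the Kubernetes service never reports an ingress IP."""

    raise LoadBalancerIpError("Could not retrieve LoadBalancer IP address")

With an exception instance in place the benchmark would still fail here, but with a message pointing at the real problem: mongo-load-balancer-service never received an external IP within the polling window.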

Jenkins build is back to normal : beam_PerformanceTests_MongoDBIO_IT #62

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/62/display/redirect?page=changes>


Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #61

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/61/display/redirect>

------------------------------------------
[...truncated 56.97 KB...]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.99.224:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.99.224:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.99.224:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Workflow failed. Causes: S03:Read all documents/Read(BoundedMongoDbSource)+Map documents to Strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed.
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:127)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   MongoDBIOIT.testWriteAndRead:127  Runtime com.mongodb.MongoTimeoutException: ...
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-mongodb ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-mongodb ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/beam/sdks/java/io/mongodb/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:12 min
[INFO] Finished at: 2018-04-17T12:16:04Z
[INFO] Final Memory: 93M/1252M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-04-17 12:16:05,411 86229efd MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 644, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 12:16:05,412 86229efd MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-17 12:16:05,413 86229efd MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523959299870> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-17 12:16:05,928 86229efd MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 778, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 644, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 12:16:05,928 86229efd MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-17 12:16:05,952 86229efd MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-17 12:16:05,953 86229efd MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/pkb.log>
2018-04-17 12:16:05,953 86229efd MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/86229efd/completion_statuses.json>
Build step 'Execute shell' marked build as failure
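
The PerfKit traceback above only records that the Dataflow job exited non-zero (the assert in gcp_dpb_dataflow.py, line 90); the Maven logs in the messages below show the underlying error: every MongoDB operation times out after 30000 ms because the driver cannot open a socket to the service's external LoadBalancer address (35.225.162.7:27017 in #60, 35.226.169.59:27017 in #59, 35.224.18.180:27017 in #58). A quick way to check whether that endpoint is reachable at all, independent of the Beam pipeline, is a standalone probe with a short server-selection timeout. The sketch below is illustrative only: it uses the legacy 3.x Java driver API visible in the stack traces (com.mongodb.MongoClient), and the class name, host/port, database and 5-second timeouts are placeholders to adjust to the current deployment.

    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientOptions;
    import com.mongodb.ServerAddress;
    import org.bson.Document;

    public class MongoProbe {
      public static void main(String[] args) {
        // Fail fast instead of waiting the default 30000 ms for server selection.
        MongoClientOptions options = MongoClientOptions.builder()
            .serverSelectionTimeout(5000)
            .connectTimeout(5000)
            .build();
        // Host/port copied from the failing #60 log; adjust to the current LoadBalancer IP.
        MongoClient client = new MongoClient(new ServerAddress("35.225.162.7", 27017), options);
        try {
          // "ping" succeeds only if a server can actually be selected and reached.
          Document reply = client.getDatabase("admin").runCommand(new Document("ping", 1));
          System.out.println("MongoDB reachable: " + reply.toJson());
        } catch (Exception e) {
          System.out.println("MongoDB not reachable: " + e);
        } finally {
          client.close();
        }
      }
    }

If the probe also times out from outside the cluster, the problem is more likely the LoadBalancer service defined in .test-infra/kubernetes/mongodb/load-balancer/mongo.yml (or its firewall rules) than MongoDbIO itself.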

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #60

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/60/display/redirect?page=changes>

Changes:

[apilloud] [SQL] Remove PRIMARY KEY it does nothing

[apilloud] [SQL] Plumb through column nullable field

[apilloud] [SQL] Copy in DDL code from Calcite 1.16

[apilloud] [SQL] Patch ddl code for beam

[ehudm] Normalize Filesystems.match() glob behavior.

[amyrvold] Fix failing nightly release build

------------------------------------------
[...truncated 63.43 KB...]
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.162.7:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.162.7:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.162.7:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
Workflow failed. Causes: S01:Generate sequence/Read(BoundedCountingSource)+Produce documents/Map+Write documents to MongoDB/ParDo(Write) failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  mongodbioit0testwriteandr-04162303-7q3n-harness-7hn1,
  mongodbioit0testwriteandr-04162303-7q3n-harness-7hn1,
  mongodbioit0testwriteandr-04162303-7q3n-harness-7hn1,
  mongodbioit0testwriteandr-04162303-7q3n-harness-7hn1
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:114)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

[ERROR] testWriteAndRead(org.apache.beam.sdk.io.mongodb.MongoDBIOIT)  Time elapsed: 0 s  <<< ERROR!
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.225.162.7:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:158)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:133)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:128)
	at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:51)
	at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:36)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:136)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.tearDown(MongoDBIOIT.java:99)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR] org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(org.apache.beam.sdk.io.mongodb.MongoDBIOIT)
[ERROR]   Run 1: MongoDBIOIT.testWriteAndRead:114  Runtime com.mongodb.MongoSocketReadExceptio...
[ERROR]   Run 2: MongoDBIOIT.tearDown:99  MongoTimeout Timed out after 30000 ms while waiting ...
[INFO] 
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-mongodb ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-mongodb ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/beam/sdks/java/io/mongodb/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10:25 min
[INFO] Finished at: 2018-04-17T06:13:41Z
[INFO] Final Memory: 93M/1247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-04-17 06:13:42,753 a7762df9 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 644, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 06:13:42,754 a7762df9 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-17 06:13:42,755 a7762df9 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523941316285> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-17 06:13:43,210 a7762df9 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 778, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 644, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 06:13:43,211 a7762df9 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-17 06:13:43,254 a7762df9 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-17 06:13:43,255 a7762df9 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/pkb.log>
2018-04-17 06:13:43,256 a7762df9 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/a7762df9/completion_statuses.json>
Build step 'Execute shell' marked build as failure
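
Note that build #60 reports two errors for the single test above: Run 1 is the pipeline failure in testWriteAndRead (MongoDBIOIT.java:114) and Run 2 is the JUnit tearDown (MongoDBIOIT.java:99) timing out while dropping the database, so the cleanup failure is stacked on top of the real one. A minimal sketch of a more defensive cleanup follows; it assumes a JUnit @After method broadly similar to the one named in the trace, and the client field and database name ("beam") are placeholders rather than the actual MongoDBIOIT code. Whether cleanup errors should be logged instead of rethrown is a judgment call, but doing so keeps the report focused on the pipeline failure.

    import com.mongodb.MongoClient;
    import com.mongodb.MongoTimeoutException;
    import org.junit.After;

    public class MongoCleanupSketch {
      private MongoClient client;  // created in @Before in a real test

      @After
      public void tearDown() {
        if (client == null) {
          return;
        }
        try {
          // Drop the scratch database, but don't turn an unreachable server
          // into a second, unrelated test error.
          client.getDatabase("beam").drop();
        } catch (MongoTimeoutException e) {
          System.err.println("Skipping MongoDB cleanup, server unreachable: " + e.getMessage());
        } finally {
          client.close();
        }
      }
    }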

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #59

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/59/display/redirect?page=changes>

Changes:

[swegner] Fix a typo in gradle task group

[mingmxu] support MAP in SQL schema

[mingmxu] in MAP, key as primitive, and value can be primitive/array/map/row

[mingmxu] use Collection for ARRAY type, and re-org `verify` code in `Row`

[mingmxu] rebase as file conflict with #5089

[github] Update containers at master to newly released beam-master-20180413.

[mingmxu] rename CollectionType to CollectionElementType

[github] Add region to dataflowOptions struct.

[sidhom] [BEAM-4056] Identify side inputs by transform id and local name

[sidhom] Add side input assertions to ExecutableStageMatcher

------------------------------------------
[...truncated 55.18 KB...]
	at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:43)
	at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:48)
	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:200)
	at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:158)
	at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.226.169.59:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:219)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:168)
	at com.mongodb.operation.MixedBulkWriteOperation.execute(MixedBulkWriteOperation.java:74)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:323)
	at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:311)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.flush(MongoDbIO.java:667)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$Write$WriteFn.processElement(MongoDbIO.java:652)
Workflow failed. Causes: S01:Generate sequence/Read(BoundedCountingSource)+Produce documents/Map+Write documents to MongoDB/ParDo(Write) failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: 
  mongodbioit0testwriteandr-04161708-tmtp-harness-fwlr,
  mongodbioit0testwriteandr-04161708-tmtp-harness-fwlr,
  mongodbioit0testwriteandr-04161708-tmtp-harness-fwlr,
  mongodbioit0testwriteandr-04161708-tmtp-harness-fwlr
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:114)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   MongoDBIOIT.testWriteAndRead:114  Runtime com.mongodb.MongoSocketReadExceptio...
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-mongodb ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-mongodb ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/beam/sdks/java/io/mongodb/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 09:43 min
[INFO] Finished at: 2018-04-17T00:17:32Z
[INFO] Final Memory: 93M/1248M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-04-17 00:17:32,723 d1eca819 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 00:17:32,724 d1eca819 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-17 00:17:32,724 d1eca819 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523922753154> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-17 00:17:33,310 d1eca819 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 757, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-17 00:17:33,310 d1eca819 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-17 00:17:33,343 d1eca819 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-17 00:17:33,344 d1eca819 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/pkb.log>
2018-04-17 00:17:33,344 d1eca819 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/d1eca819/completion_statuses.json>
Build step 'Execute shell' marked build as failure
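
In build #59 the write path fails: the trace shows MongoDbIO's WriteFn buffering documents and flushing them with insertMany (MongoDbIO.java:652/667), and when no writable server can be selected the flush throws MongoTimeoutException, the bundle fails, and the Dataflow service retries it four times on the same worker before giving up. The sketch below is a simplified illustration of that buffered-write pattern, not the actual MongoDbIO source; the endpoint, database, collection and batch size are placeholders taken loosely from this log.

    import java.util.ArrayList;
    import java.util.List;
    import com.mongodb.MongoClient;
    import com.mongodb.client.MongoCollection;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.bson.Document;

    // Simplified buffered MongoDB write, for illustration only.
    class BufferedMongoWriteFn extends DoFn<Document, Void> {
      private static final int BATCH_SIZE = 1024;
      private transient MongoClient client;
      private transient List<Document> batch;

      @Setup
      public void setup() {
        client = new MongoClient("35.226.169.59", 27017);  // endpoint from the #59 log
        batch = new ArrayList<>();
      }

      @ProcessElement
      public void processElement(ProcessContext c) {
        batch.add(c.element());
        if (batch.size() >= BATCH_SIZE) {
          flush();
        }
      }

      @FinishBundle
      public void finishBundle() {
        flush();
      }

      private void flush() {
        if (batch.isEmpty()) {
          return;
        }
        MongoCollection<Document> collection =
            client.getDatabase("beam").getCollection("test_collection");
        // insertMany needs a writable server; if none can be selected before the
        // server-selection timeout it throws MongoTimeoutException and the runner
        // retries the whole bundle (four attempts in the log above).
        collection.insertMany(new ArrayList<>(batch));
        batch.clear();
      }

      @Teardown
      public void teardown() {
        if (client != null) {
          client.close();
        }
      }
    }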

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #58

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/58/display/redirect?page=changes>

Changes:

[tgroh] Update Dataflow Development Container Version

[github] Add region to dataflowOptions as well.

[tgroh] Use Explicit PipelineOptions in Native Evaluators

------------------------------------------
[...truncated 53.20 KB...]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.224.18.180:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.224.18.180:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.224.18.180:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Workflow failed. Causes: S03:Read all documents/Read(BoundedMongoDbSource)+Map documents to Strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed.
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:127)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   MongoDBIOIT.testWriteAndRead:127  Runtime com.mongodb.MongoTimeoutException: ...
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-mongodb ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-mongodb ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/beam/sdks/java/io/mongodb/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:54 min
[INFO] Finished at: 2018-04-16T18:14:02Z
[INFO] Final Memory: 94M/1024M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-04-16 18:14:02,873 eca035d9 MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-16 18:14:02,875 eca035d9 MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-16 18:14:02,875 eca035d9 MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523890886837> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-16 18:14:03,658 eca035d9 MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 757, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-16 18:14:03,659 eca035d9 MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-16 18:14:03,720 eca035d9 MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-16 18:14:03,720 eca035d9 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/pkb.log>
2018-04-16 18:14:03,721 eca035d9 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/eca035d9/completion_statuses.json>
Build step 'Execute shell' marked build as failure

Build failed in Jenkins: beam_PerformanceTests_MongoDBIO_IT #57

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/57/display/redirect>

------------------------------------------
[...truncated 55.79 KB...]
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.224.189.206:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
	at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
	at com.mongodb.Mongo.execute(Mongo.java:772)
	at com.mongodb.Mongo$2.execute(Mongo.java:759)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
	at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
	at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
	at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
	at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:75)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:381)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:353)
	at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:284)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
	at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Workflow failed. Causes: S03:Read all documents/Read(BoundedMongoDbSource)+Map documents to Strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed.
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
	at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:346)
	at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:328)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:127)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
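
For readers mapping the fused step names above ("Read all documents", "Map documents to Strings", "Calculate hashcode") back to code: a minimal sketch of a read pipeline of that shape, assuming hypothetical connection values (this is only an illustration of the MongoDbIO read API, not the MongoDBIOIT source):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.mongodb.MongoDbIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.PCollection;
    import org.bson.Document;

    public class ReadPipelineSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        // "Read all documents": the source whose split() timed out above.
        PCollection<Document> documents = pipeline.apply("Read all documents",
            MongoDbIO.read()
                .withUri("mongodb://35.224.189.206:27017") // hypothetical
                .withDatabase("beam")                      // hypothetical
                .withCollection("test_collection"));       // hypothetical

        // "Map documents to Strings": convert each document before hashing.
        PCollection<String> strings = documents.apply("Map documents to Strings",
            MapElements.via(new SimpleFunction<Document, String>() {
              @Override
              public String apply(Document document) {
                return document.toJson();
              }
            }));

        // The integration test then combines the strings into a single
        // hashcode ("Calculate hashcode") and asserts on that value.
        pipeline.run().waitUntilFinish();
      }
    }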

[ERROR] testWriteAndRead(org.apache.beam.sdk.io.mongodb.MongoDBIOIT)  Time elapsed: 0 s  <<< ERROR!
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=35.224.189.206:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
	at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
	at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
	at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
	at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:158)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:133)
	at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:128)
	at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:51)
	at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:36)
	at com.mongodb.Mongo.execute(Mongo.java:781)
	at com.mongodb.Mongo$2.execute(Mongo.java:764)
	at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:136)
	at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.tearDown(MongoDBIOIT.java:99)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.apache.beam.sdk.testing.TestPipeline$1.evaluate(TestPipeline.java:317)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.apache.maven.surefire.junitcore.pc.Scheduler$1.run(Scheduler.java:410)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
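
Both the read failure and the tearDown failure are the driver's 30000 ms server selection timeout against 35.224.189.206:27017. When reproducing locally, the relevant knobs on the 3.x driver seen in these traces are serverSelectionTimeout and connectTimeout; a minimal sketch follows (the address is the one reported above, the timeout values are arbitrary):

    import com.mongodb.MongoClient;
    import com.mongodb.MongoClientOptions;
    import com.mongodb.ServerAddress;

    public class ConnectivityCheckSketch {
      public static void main(String[] args) {
        // serverSelectionTimeout is the 30000 ms wait reported by
        // MongoTimeoutException; connectTimeout governs the underlying
        // "connect timed out" SocketTimeoutException.
        MongoClientOptions options = MongoClientOptions.builder()
            .serverSelectionTimeout(60_000) // ms, default 30000
            .connectTimeout(10_000)         // ms
            .build();
        MongoClient client =
            new MongoClient(new ServerAddress("35.224.189.206", 27017), options);
        try {
          // Forces a round trip, so connectivity problems surface here
          // rather than inside the Beam pipeline or the test tearDown.
          for (String name : client.listDatabaseNames()) {
            System.out.println(name);
          }
        } finally {
          client.close();
        }
      }
    }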

[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR] org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(org.apache.beam.sdk.io.mongodb.MongoDBIOIT)
[ERROR]   Run 1: MongoDBIOIT.testWriteAndRead:127  Runtime com.mongodb.MongoTimeoutException: ...
[ERROR]   Run 2: MongoDBIOIT.tearDown:99  MongoTimeout Timed out after 30000 ms while waiting ...
[INFO] 
[INFO] 
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-dependency-plugin:3.0.2:analyze-only (default) @ beam-sdks-java-io-mongodb ---
[INFO] No dependency problems found
[INFO] 
[INFO] --- maven-failsafe-plugin:2.21.0:verify (default) @ beam-sdks-java-io-mongodb ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/beam/sdks/java/io/mongodb/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:01 min
[INFO] Finished at: 2018-04-16T12:16:14Z
[INFO] Final Memory: 97M/1145M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.21.0:verify (default) on project beam-sdks-java-io-mongodb: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: There are test failures.

Please refer to <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/beam/sdks/java/io/mongodb/target/failsafe-reports> for the individual test results.
Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
    at org.apache.maven.plugin.surefire.SurefireHelper.throwException (SurefireHelper.java:240)
    at org.apache.maven.plugin.surefire.SurefireHelper.reportExecution (SurefireHelper.java:112)
    at org.apache.maven.plugin.failsafe.VerifyMojo.execute (VerifyMojo.java:192)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR] 
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException

STDERR: 
2018-04-16 12:16:15,282 c3fe907d MainThread beam_integration_benchmark(1/1) ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-16 12:16:15,283 c3fe907d MainThread beam_integration_benchmark(1/1) INFO     Cleaning up benchmark beam_integration_benchmark
2018-04-16 12:16:15,283 c3fe907d MainThread beam_integration_benchmark(1/1) INFO     Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1523872895032> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml>
2018-04-16 12:16:15,800 c3fe907d MainThread beam_integration_benchmark(1/1) ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 757, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 623, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 526, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-16 12:16:15,800 c3fe907d MainThread beam_integration_benchmark(1/1) ERROR    Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-16 12:16:15,832 c3fe907d MainThread INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED                  
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-16 12:16:15,833 c3fe907d MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/pkb.log>
2018-04-16 12:16:15,833 c3fe907d MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/c3fe907d/completion_statuses.json>
Build step 'Execute shell' marked build as failure