Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/05/09 12:05:21 UTC

Build failed in Jenkins: beam_PerformanceTests_Spark #1690

See <https://builds.apache.org/job/beam_PerformanceTests_Spark/1690/display/redirect>

------------------------------------------
[...truncated 23.83 KB...]
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@607b2792{/jobs/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@138a7441{/jobs/job,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e598df9{/jobs/job/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@99a65d3{/stages,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42cc13a0{/stages/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6813a331{/stages/stage,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@111610e6{/stages/stage/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29d37757{/stages/pool,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@25cc7470{/stages/pool/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@79b663b3{/storage,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5d28bcd5{/storage/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@32639b12{/storage/rdd,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3887cf88{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@78dc4696{/environment,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5652f555{/environment/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@55120f99{/executors,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38f2e97e{/executors/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@323659f8{/executors/threadDump,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e521715{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@265c5d69{/static,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1d2644e3{/,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@602c4656{/api,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@63998bf4{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@61942c1{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-05-09 12:03:10 INFO  SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.128.0.5:4040
2018-05-09 12:03:10 INFO  SparkContext:54 - Added JAR file:/usr/lib/hadoop/hadoop-common.jar at spark://10.128.0.5:38853/jars/hadoop-common.jar with timestamp 1525867390316
2018-05-09 12:03:10 INFO  Utils:54 - Using initial executors = 10000, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
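The Utils line above describes how Spark picks the initial executor target under dynamic allocation: the maximum of three settings. A minimal sketch of that computation (the attribution of the 10000 to a specific one of the three properties is an assumption; the log does not say which was set):

```python
# Hedged sketch of the selection rule in the log line above: the initial
# executor target is the max of spark.dynamicAllocation.initialExecutors,
# spark.dynamicAllocation.minExecutors and spark.executor.instances.
def initial_target_executors(initial_execs, min_execs, num_instances):
    return max(initial_execs, min_execs, num_instances)

# With one of the settings at 10000 (an assumption) and the others unset/0,
# the target comes out as 10000, matching the log.
target = initial_target_executors(10000, 0, 0)
```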
2018-05-09 12:03:11 INFO  GoogleHadoopFileSystemBase:648 - GHFS version: 1.6.5-hadoop2
2018-05-09 12:03:12 INFO  RMProxy:123 - Connecting to ResourceManager at pkb-b94b5018-m/10.128.0.5:8032
2018-05-09 12:03:13 INFO  Client:54 - Requesting a new application from cluster with 2 NodeManagers
2018-05-09 12:03:13 INFO  Client:54 - Verifying our application has not requested more than the maximum memory capability of the cluster (3072 MB per container)
2018-05-09 12:03:13 INFO  Client:54 - Will allocate AM container, with 1024 MB memory including 384 MB overhead
2018-05-09 12:03:13 INFO  Client:54 - Setting up container launch context for our AM
2018-05-09 12:03:13 INFO  Client:54 - Setting up the launch environment for our AM container
2018-05-09 12:03:13 INFO  Client:54 - Preparing resources for our AM container
2018-05-09 12:03:16 INFO  Client:54 - Uploading resource file:/usr/lib/spark/examples/jars/spark-examples.jar -> hdfs://pkb-b94b5018-m/user/root/.sparkStaging/application_1525867317396_0001/spark-examples.jar
2018-05-09 12:03:17 INFO  Client:54 - Uploading resource file:/hadoop/spark/tmp/spark-36ebcb2c-5a45-4259-9eac-11b20c87782a/__spark_conf__634130004453229685.zip -> hdfs://pkb-b94b5018-m/user/root/.sparkStaging/application_1525867317396_0001/__spark_conf__.zip
2018-05-09 12:03:17 WARN  DataStreamer:975 - Caught exception
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.Thread.join(Thread.java:1252)
	at java.lang.Thread.join(Thread.java:1326)
	at org.apache.hadoop.hdfs.DataStreamer.closeResponder(DataStreamer.java:973)
	at org.apache.hadoop.hdfs.DataStreamer.endBlock(DataStreamer.java:624)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:801)
2018-05-09 12:03:17 INFO  SecurityManager:54 - Changing view acls to: root
2018-05-09 12:03:17 INFO  SecurityManager:54 - Changing modify acls to: root
2018-05-09 12:03:17 INFO  SecurityManager:54 - Changing view acls groups to: 
2018-05-09 12:03:17 INFO  SecurityManager:54 - Changing modify acls groups to: 
2018-05-09 12:03:17 INFO  SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2018-05-09 12:03:17 INFO  Client:54 - Submitting application application_1525867317396_0001 to ResourceManager
2018-05-09 12:03:18 INFO  YarnClientImpl:278 - Submitted application application_1525867317396_0001
2018-05-09 12:03:18 INFO  SchedulerExtensionServices:54 - Starting Yarn extension services with app application_1525867317396_0001 and attemptId None
2018-05-09 12:03:19 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:19 INFO  Client:54 - 
	 client token: N/A
	 diagnostics: [Wed May 09 12:03:18 +0000 2018] Scheduler has assigned a container for AM, waiting for AM container to be launched
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1525867398014
	 final status: UNDEFINED
	 tracking URL: http://pkb-b94b5018-m:8088/proxy/application_1525867317396_0001/
	 user: root
2018-05-09 12:03:20 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:21 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:22 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:23 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:24 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:25 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:26 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:27 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:28 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:29 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:30 INFO  Client:54 - Application report for application_1525867317396_0001 (state: ACCEPTED)
2018-05-09 12:03:30 INFO  YarnSchedulerBackend$YarnSchedulerEndpoint:54 - ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2018-05-09 12:03:30 INFO  YarnClientSchedulerBackend:54 - Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> pkb-b94b5018-m, PROXY_URI_BASES -> http://pkb-b94b5018-m:8088/proxy/application_1525867317396_0001), /proxy/application_1525867317396_0001
2018-05-09 12:03:30 INFO  JettyUtils:54 - Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
2018-05-09 12:03:31 INFO  Client:54 - Application report for application_1525867317396_0001 (state: RUNNING)
2018-05-09 12:03:31 INFO  Client:54 - 
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 10.128.0.6
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1525867398014
	 final status: UNDEFINED
	 tracking URL: http://pkb-b94b5018-m:8088/proxy/application_1525867317396_0001/
	 user: root
2018-05-09 12:03:31 INFO  YarnClientSchedulerBackend:54 - Application application_1525867317396_0001 has started running.
2018-05-09 12:03:31 INFO  Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 57602.
2018-05-09 12:03:31 INFO  NettyBlockTransferService:54 - Server created on 10.128.0.5:57602
2018-05-09 12:03:31 INFO  BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-05-09 12:03:31 INFO  BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.128.0.5, 57602, None)
2018-05-09 12:03:31 INFO  BlockManagerMasterEndpoint:54 - Registering block manager 10.128.0.5:57602 with 376.8 MB RAM, BlockManagerId(driver, 10.128.0.5, 57602, None)
2018-05-09 12:03:31 INFO  BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.128.0.5, 57602, None)
2018-05-09 12:03:31 INFO  BlockManager:54 - external shuffle service port = 7337
2018-05-09 12:03:31 INFO  BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.128.0.5, 57602, None)
2018-05-09 12:03:31 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@77896335{/metrics/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:32 INFO  EventLoggingListener:54 - Logging events to hdfs://pkb-b94b5018-m/user/spark/eventlog/application_1525867317396_0001
2018-05-09 12:03:32 INFO  Utils:54 - Using initial executors = 10000, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
2018-05-09 12:03:32 INFO  YarnClientSchedulerBackend:54 - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2018-05-09 12:03:32 INFO  SharedState:54 - loading hive config file: file:/etc/hive/conf.dist/hive-site.xml
2018-05-09 12:03:32 INFO  SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/tmp/dddbf056-49c7-4c35-808a-effe958424a2/spark-warehouse').
2018-05-09 12:03:32 INFO  SharedState:54 - Warehouse path is 'file:/tmp/dddbf056-49c7-4c35-808a-effe958424a2/spark-warehouse'.
2018-05-09 12:03:32 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5599b5bb{/SQL,null,AVAILABLE,@Spark}
2018-05-09 12:03:32 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@4264beb8{/SQL/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:32 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7ead1d80{/SQL/execution,null,AVAILABLE,@Spark}
2018-05-09 12:03:32 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@1182413a{/SQL/execution/json,null,AVAILABLE,@Spark}
2018-05-09 12:03:32 INFO  ContextHandler:781 - Started o.s.j.s.ServletContextHandler@362a561e{/static/sql,null,AVAILABLE,@Spark}
2018-05-09 12:03:33 INFO  StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-05-09 12:03:36 WARN  GoogleHadoopFileSystemBase:75 - GHFS.configureBuckets: Warning. No GCS bucket provided. Falling back on deprecated fs.gs.system.bucket.
Exception in thread "main" org.apache.spark.sql.AnalysisException: Path does not exist: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/etc/hosts;
	at org.apache.spark.sql.execution.datasources.DataSource$.org$apache$spark$sql$execution$datasources$DataSource$$checkAndGlobPathIfNecessary(DataSource.scala:626)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
	at scala.collection.immutable.List.flatMap(List.scala:344)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:349)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
	at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:623)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:657)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:632)
	at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:45)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
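The AnalysisException above follows directly from the GHFS warning before it: the benchmark's input (reported later in the PerfKit metadata as `input_location: 'gs:///etc/hosts'`) has an empty bucket component, so the GCS connector fell back on the deprecated `fs.gs.system.bucket` and resolved the path against the cluster's staging bucket, where `/etc/hosts` does not exist. A minimal sketch of why `gs:///etc/hosts` is malformed (the helper name is hypothetical, not part of the connector):

```python
from urllib.parse import urlparse

# Hedged sketch: a gs:// URI carries the bucket in the netloc position.
# "gs:///etc/hosts" parses with an empty netloc, i.e. no bucket at all,
# which is what triggers the deprecated system-bucket fallback warned
# about above and yields the nonexistent gs://dataproc-... path.
def gs_bucket_and_path(uri):
    parsed = urlparse(uri)
    if parsed.scheme != "gs":
        raise ValueError("not a gs:// URI: %s" % uri)
    return parsed.netloc, parsed.path

bucket, path = gs_bucket_and_path("gs:///etc/hosts")
# bucket is the empty string here; a well-formed input would be
# e.g. "gs://some-bucket/etc/hosts" with a non-empty bucket.
```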
2018-05-09 12:03:37 INFO  SparkContext:54 - Invoking stop() from shutdown hook
2018-05-09 12:03:37 INFO  AbstractConnector:310 - Stopped Spark@5edf401a{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-05-09 12:03:37 INFO  SparkUI:54 - Stopped Spark web UI at http://10.128.0.5:4040
2018-05-09 12:03:37 INFO  YarnClientSchedulerBackend:54 - Interrupting monitor thread
2018-05-09 12:03:37 INFO  YarnClientSchedulerBackend:54 - Shutting down all executors
2018-05-09 12:03:37 INFO  YarnSchedulerBackend$YarnDriverEndpoint:54 - Asking each executor to shut down
2018-05-09 12:03:37 INFO  SchedulerExtensionServices:54 - Stopping SchedulerExtensionServices
(serviceOption=None,
 services=List(),
 started=false)
2018-05-09 12:03:37 INFO  YarnClientSchedulerBackend:54 - Stopped
2018-05-09 12:03:37 INFO  MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-05-09 12:03:37 INFO  MemoryStore:54 - MemoryStore cleared
2018-05-09 12:03:37 INFO  BlockManager:54 - BlockManager stopped
2018-05-09 12:03:37 INFO  BlockManagerMaster:54 - BlockManagerMaster stopped
2018-05-09 12:03:37 INFO  OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-05-09 12:03:37 INFO  SparkContext:54 - Successfully stopped SparkContext
2018-05-09 12:03:37 INFO  ShutdownHookManager:54 - Shutdown hook called
2018-05-09 12:03:37 INFO  ShutdownHookManager:54 - Deleting directory /hadoop/spark/tmp/spark-36ebcb2c-5a45-4259-9eac-11b20c87782a
ERROR: (gcloud.dataproc.jobs.submit.spark) Job [dddbf056-49c7-4c35-808a-effe958424a2] entered state [ERROR] while waiting for [DONE].

2018-05-09 12:03:42,888 b94b5018 MainThread dpb_wordcount_benchmark(1/1) INFO     Cleaning up benchmark dpb_wordcount_benchmark
2018-05-09 12:03:42,888 b94b5018 MainThread dpb_wordcount_benchmark(1/1) INFO     Tearing down resources for benchmark dpb_wordcount_benchmark
2018-05-09 12:03:42,889 b94b5018 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters delete pkb-b94b5018 --format json --quiet
2018-05-09 12:05:14,890 b94b5018 MainThread dpb_wordcount_benchmark(1/1) INFO     Running: gcloud dataproc clusters describe pkb-b94b5018 --format json --quiet
2018-05-09 12:05:15,581 b94b5018 MainThread dpb_wordcount_benchmark(1/1) INFO     Ran: {gcloud dataproc clusters describe pkb-b94b5018 --format json --quiet}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-b94b5018

2018-05-09 12:05:15,644 b94b5018 MainThread INFO     Publishing 2 samples to <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b94b5018/perfkit-bq-pub7d4LTY.json>
2018-05-09 12:05:15,645 b94b5018 MainThread INFO     Publishing 2 samples to beam_performance.spark_pkp_results
2018-05-09 12:05:15,645 b94b5018 MainThread INFO     Running: bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON beam_performance.spark_pkp_results <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b94b5018/perfkit-bq-pub7d4LTY.json>
2018-05-09 12:05:20,932 b94b5018 MainThread INFO     
-------------------------PerfKitBenchmarker Complete Results-------------------------
{'metadata': {'dpb_cluster_id': 'pkb-b94b5018',
              'dpb_cluster_shape': 'n1-standard-1',
              'dpb_cluster_size': 2,
              'dpb_service': 'dataproc',
              'input_location': 'gs:///etc/hosts',
              'perfkitbenchmarker_version': 'v1.12.0-580-g36cfc82',
              'run_number': 0},
 'metric': 'run_time',
 'official': True,
 'owner': 'jenkins',
 'product_name': 'PerfKitBenchmarker',
 'run_uri': 'b94b5018-5fc5e87d-e254-4498-83a9-61720041e21f',
 'sample_uri': 'bc9e86e7-be61-49f9-997d-88d65b2ef165',
 'test': 'dpb_wordcount_benchmark',
 'timestamp': 1525867422.887806,
 'unit': 'seconds',
 'value': 39.731362}
{'metadata': {'perfkitbenchmarker_version': 'v1.12.0-580-g36cfc82'},
 'metric': 'End to End Runtime',
 'official': True,
 'owner': 'jenkins',
 'product_name': 'PerfKitBenchmarker',
 'run_uri': 'b94b5018-5fc5e87d-e254-4498-83a9-61720041e21f',
 'sample_uri': 'e9334aca-dbbf-40f8-9943-53d4b4460d06',
 'test': 'dpb_wordcount_benchmark',
 'timestamp': 1525867515.58256,
 'unit': 'seconds',
 'value': 276.9262421131134}


-------------------------PerfKitBenchmarker Results Summary-------------------------
DPB_WORDCOUNT_BENCHMARK:
  dpb_cluster_id="pkb-b94b5018" dpb_cluster_shape="n1-standard-1" dpb_cluster_size="2" dpb_service="dataproc" input_location="gs:///etc/hosts" run_number="0"
  run_time                             39.731362 seconds                       
  End to End Runtime                  276.926242 seconds                       

-------------------------
For all tests: perfkitbenchmarker_version="v1.12.0-580-g36cfc82"
2018-05-09 12:05:20,933 b94b5018 MainThread INFO     Publishing 2 samples to <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b94b5018/perfkitbenchmarker_results.json>
2018-05-09 12:05:20,933 b94b5018 MainThread INFO     Benchmark run statuses:
------------------------------------------------------------------------------
Name                     UID                       Status     Failed Substatus
------------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  SUCCEEDED                  
------------------------------------------------------------------------------
Success rate: 100.00% (1/1)
2018-05-09 12:05:20,933 b94b5018 MainThread INFO     Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b94b5018/pkb.log>
2018-05-09 12:05:20,934 b94b5018 MainThread INFO     Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/b94b5018/completion_statuses.json>
[Set GitHub commit status (universal)] SUCCESS on repos [GHRepository@7490e3ce[description=Apache Beam,homepage=,name=beam,fork=false,size=58754,milestones={},language=Java,commits={},source=<null>,parent=<null>,responseHeaderFields={null=[HTTP/1.1 200 OK], Access-Control-Allow-Origin=[*], Access-Control-Expose-Headers=[ETag, Link, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval], Cache-Control=[private, max-age=60, s-maxage=60], Content-Encoding=[gzip], Content-Security-Policy=[default-src 'none'], Content-Type=[application/json; charset=utf-8], Date=[Wed, 09 May 2018 12:05:21 GMT], ETag=[W/"c9c57ab721dc1f578ccd914bb574f9f9"], Last-Modified=[Wed, 09 May 2018 10:15:34 GMT], OkHttp-Received-Millis=[1525867521686], OkHttp-Response-Source=[NETWORK 200], OkHttp-Selected-Protocol=[http/1.1], OkHttp-Sent-Millis=[1525867521529], Referrer-Policy=[origin-when-cross-origin, strict-origin-when-cross-origin], Server=[GitHub.com], Status=[200 OK], Strict-Transport-Security=[max-age=31536000; includeSubdomains; preload], Transfer-Encoding=[chunked], Vary=[Accept, Authorization, Cookie, X-GitHub-OTP], X-Accepted-OAuth-Scopes=[repo], X-Content-Type-Options=[nosniff], X-Frame-Options=[deny], X-GitHub-Media-Type=[github.v3; format=json], X-GitHub-Request-Id=[9488:4F3F:7C23A5:1265BA4:5AF2E3EC], X-OAuth-Scopes=[admin:repo_hook, repo, repo:status], X-RateLimit-Limit=[5000], X-RateLimit-Remaining=[4999], X-RateLimit-Reset=[1525871121], X-Runtime-rack=[0.075117], X-XSS-Protection=[1; mode=block]},url=https://api.github.com/repos/apache/beam,id=50904245]] (sha:60f90c8) with context:beam_PerformanceTests_Spark
Setting commit status on GitHub for https://github.com/apache/beam/commit/60f90c8dcb229c35a82c7be15e64a89678bae058
ERROR: Build step failed with exception
java.io.FileNotFoundException: https://api.github.com/repos/apache/beam/statuses/60f90c8dcb229c35a82c7be15e64a89678bae058
	at com.squareup.okhttp.internal.huc.HttpURLConnectionImpl.getInputStream(HttpURLConnectionImpl.java:243)
	at com.squareup.okhttp.internal.huc.DelegatingHttpsURLConnection.getInputStream(DelegatingHttpsURLConnection.java:210)
	at com.squareup.okhttp.internal.huc.HttpsURLConnectionImpl.getInputStream(HttpsURLConnectionImpl.java:25)
	at org.kohsuke.github.Requester.parse(Requester.java:612)
	at org.kohsuke.github.Requester.parse(Requester.java:594)
	at org.kohsuke.github.Requester._to(Requester.java:272)
Caused: org.kohsuke.github.GHFileNotFoundException: {"message":"Not Found","documentation_url":"https://developer.github.com/v3/repos/statuses/#create-a-status"}
	at org.kohsuke.github.Requester.handleApiError(Requester.java:686)
	at org.kohsuke.github.Requester._to(Requester.java:293)
	at org.kohsuke.github.Requester.to(Requester.java:234)
	at org.kohsuke.github.GHRepository.createCommitStatus(GHRepository.java:1075)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:160)
Caused: org.jenkinsci.plugins.github.common.CombineErrorHandler$ErrorHandlingException
	at org.jenkinsci.plugins.github.common.CombineErrorHandler.handle(CombineErrorHandler.java:74)
	at org.jenkinsci.plugins.github.status.GitHubCommitStatusSetter.perform(GitHubCommitStatusSetter.java:164)
	at com.cloudbees.jenkins.GitHubCommitNotifier.perform(GitHubCommitNotifier.java:151)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:81)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:744)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:690)
	at hudson.model.Build$BuildExecution.post2(Build.java:186)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:635)
	at hudson.model.Run.execute(Run.java:1749)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
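The FileNotFoundException above is the github-api client surfacing an HTTP 404 from GitHub's "create a commit status" endpoint. A 404 there commonly means the token cannot see the repository with write access or the SHA is not known to GitHub; which applies here is not determinable from the log. A minimal sketch that just reconstructs the endpoint URL shown in the stack trace (repo and SHA taken from the log; the helper is hypothetical):

```python
# Hedged sketch: the endpoint the Jenkins plugin POSTs to is
# /repos/{owner}/{repo}/statuses/{sha}; a 404 response is reported by the
# client as GHFileNotFoundException, as seen above.
def statuses_url(owner, repo, sha):
    return "https://api.github.com/repos/%s/%s/statuses/%s" % (owner, repo, sha)

url = statuses_url("apache", "beam",
                   "60f90c8dcb229c35a82c7be15e64a89678bae058")
```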
Build step 'Set build status on GitHub commit [deprecated]' marked build as failure

Jenkins build is back to normal : beam_PerformanceTests_Spark #1691

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PerformanceTests_Spark/1691/display/redirect?page=changes>