Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/11/12 11:27:45 UTC
Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Spark #1036
See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/1036/display/redirect?page=changes>
Changes:
[github] Update built-in.md
------------------------------------------
[...truncated 892.23 KB...]
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/28/temp_shuffle_dc491e52-da7c-4351-86d1-f798c0dbae5a
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3
java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/07/temp_shuffle_7dd78bed-2e4a-43f2-a562-c49f16bf7cb3
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946
java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/15/temp_shuffle_43301066-7eab-4f2b-ba0a-6bdd1345d946
18/11/12 11:27:42 ERROR org.apache.spark.storage.DiskBlockObjectWriter: Uncaught exception while reverting partial writes to file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/02/temp_shuffle_7c7fa744-d90e-421b-a46e-32c5c59bc220
java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/02/temp_shuffle_7c7fa744-d90e-421b-a46e-32c5c59bc220 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter$$anonfun$revertPartialWritesAndClose$2.apply$mcV$sp(DiskBlockObjectWriter.scala:217)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1390)
at org.apache.spark.storage.DiskBlockObjectWriter.revertPartialWritesAndClose(DiskBlockObjectWriter.scala:214)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.stop(BypassMergeSortShuffleWriter.java:237)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:102)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter: Error while deleting file /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/02/temp_shuffle_7c7fa744-d90e-421b-a46e-32c5c59bc220
18/11/12 11:27:42 ERROR org.apache.spark.executor.Executor: Exception in task 46.0 in stage 0.0 (TID 46)
java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/3f/temp_shuffle_bc416c9b-bcb3-4c65-899a-f0df16c0fa6f (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.executor.Executor: Exception in task 47.0 in stage 0.0 (TID 47)
java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/1d/temp_shuffle_6bec34a8-e407-44d0-af66-4204fb9ea9b9 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 46.0 in stage 0.0 (TID 46, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/3f/temp_shuffle_bc416c9b-bcb3-4c65-899a-f0df16c0fa6f (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 ERROR org.apache.spark.scheduler.TaskSetManager: Task 46 in stage 0.0 failed 1 times; aborting job
18/11/12 11:27:42 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 47.0 in stage 0.0 (TID 47, localhost, executor driver): java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/1d/temp_shuffle_6bec34a8-e407-44d0-af66-4204fb9ea9b9 (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Putting block rdd_0_49 failed due to exception org.apache.spark.TaskKilledException.
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Putting block rdd_0_48 failed due to exception org.apache.spark.TaskKilledException.
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Block rdd_0_48 could not be removed as it was not found on disk or in memory
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Block rdd_0_49 could not be removed as it was not found on disk or in memory
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Putting block rdd_15_48 failed due to exception org.apache.spark.TaskKilledException.
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Putting block rdd_15_49 failed due to exception org.apache.spark.TaskKilledException.
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Block rdd_15_48 could not be removed as it was not found on disk or in memory
18/11/12 11:27:42 WARN org.apache.spark.storage.BlockManager: Block rdd_15_49 could not be removed as it was not found on disk or in memory
2018-11-12T11:27:42.539Z Running query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2018-11-12T11:27:42.539Z skipping since configuration is not implemented
2018-11-12T11:27:42.539Z Running query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
==========================================================================================
Run started 2018-11-12T11:27:04.746Z and ran for PT37.793S
Default configuration:
{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","sideInputType":"DIRECT","sideInputRowCount":500,"sideInputNumShards":3,"sideInputUrl":null,"numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
Configurations:
Conf Description
0000 query:PASSTHROUGH; exportSummaryToBigQuery:true; streamTimeout:60
0001 query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; streamTimeout:60
0002 query:SELECTION; exportSummaryToBigQuery:true; streamTimeout:60
0003 query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; streamTimeout:60
0004 query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0005 query:HOT_ITEMS; exportSummaryToBigQuery:true; streamTimeout:60
0006 query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0007 query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
0008 query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
0009 query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0010 query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
0011 query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
0012 query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
0013 query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
Exception in thread "main" java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/3f/temp_shuffle_bc416c9b-bcb3-4c65-899a-f0df16c0fa6f (No such file or directory)
at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:147)
at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:116)
at org.apache.beam.sdk.nexmark.Main.main(Main.java:480)
Performance:
Conf  Runtime(sec)  (Baseline)  Events(/sec)  (Baseline)  Results  (Baseline)
0000           2.7                   37622.3                 92000
0001           1.7                   59594.8                 92000
0002           1.5                   68681.3                   351
0003           5.6                   17799.9                   580
0004           3.5                    2826.5                     4
0005  *** not run ***
0006  *** not run ***
0007  *** not run ***
0008  *** not run ***
0009  *** not run ***
0010  *** not run ***
0011  *** not run ***
0012  *** not run ***
0013  *** not run ***
==========================================================================================
2018-11-12T11:27:42.540Z skipping since configuration is not implemented
2018-11-12T11:27:42.540Z Running query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
2018-11-12T11:27:42.540Z skipping since configuration is not implemented
2018-11-12T11:27:42.540Z Running query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/3f/temp_shuffle_bc416c9b-bcb3-4c65-899a-f0df16c0fa6f (No such file or directory)
at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:67)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:98)
at org.apache.beam.sdk.nexmark.NexmarkLauncher.run(NexmarkLauncher.java:1163)
at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:107)
at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:96)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.FileNotFoundException: /tmp/blockmgr-96ecb829-7a5d-494d-b0af-f72f707dea23/3f/temp_shuffle_bc416c9b-bcb3-4c65-899a-f0df16c0fa6f (No such file or directory)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at org.apache.spark.storage.DiskBlockObjectWriter.initialize(DiskBlockObjectWriter.scala:103)
at org.apache.spark.storage.DiskBlockObjectWriter.open(DiskBlockObjectWriter.scala:116)
at org.apache.spark.storage.DiskBlockObjectWriter.write(DiskBlockObjectWriter.scala:237)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:151)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
... 3 more
2018-11-12T11:27:42.542Z skipping since configuration is not implemented
2018-11-12T11:27:42.542Z Running query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
2018-11-12T11:27:42.542Z skipping since configuration is not implemented
2018-11-12T11:27:42.542Z Running query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
2018-11-12T11:27:42.543Z skipping since configuration is not implemented
2018-11-12T11:27:42.543Z Running query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
2018-11-12T11:27:42.543Z skipping since configuration is not implemented
2018-11-12T11:27:42.543Z Running query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
2018-11-12T11:27:42.543Z skipping since configuration is not implemented
18/11/12 11:27:42 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Executor task launch worker for task 50
java.lang.NullPointerException
at org.apache.spark.scheduler.Task$$anonfun$run$1.apply$mcV$sp(Task.scala:130)
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1361)
at org.apache.spark.scheduler.Task.run(Task.scala:128)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
> Task :beam-sdks-java-nexmark:run FAILED
:beam-sdks-java-nexmark:run (Thread[Daemon worker,5,main]) completed. Took 40.593 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:run'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_191/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2m 13s
67 actionable tasks: 15 executed, 52 up-to-date
Publishing build scan...
https://gradle.com/s/dh3bee2hasfvy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal: beam_PostCommit_Java_Nexmark_Spark #1037
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/1037/display/redirect>