Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2019/08/05 22:08:28 UTC
Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Spark #3299
See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/3299/display/redirect?page=changes>
Changes:
[dcavazos] [BEAM-7389] Add helper conversion samples and simplified tests
------------------------------------------
[...truncated 50.31 KB...]
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/13.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/32.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/13.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 ERROR org.apache.spark.scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/0b.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 2.0 in stage 0.0 (TID 2, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/0d.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/32.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
19/08/05 22:07:33 WARN org.apache.spark.storage.BlockManager: Putting block rdd_10_5 failed due to exception org.apache.spark.TaskKilledException.
19/08/05 22:07:33 WARN org.apache.spark.storage.BlockManager: Putting block rdd_10_6 failed due to exception org.apache.spark.TaskKilledException.
19/08/05 22:07:33 WARN org.apache.spark.storage.BlockManager: Putting block rdd_10_4 failed due to exception org.apache.spark.TaskKilledException.
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 6.0 in stage 0.0 (TID 6, localhost, executor driver): TaskKilled (Stage cancelled)
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 5.0 in stage 0.0 (TID 5, localhost, executor driver): TaskKilled (Stage cancelled)
19/08/05 22:07:33 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 4.0 in stage 0.0 (TID 4, localhost, executor driver): TaskKilled (Stage cancelled)
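The repeated "Failed to create local dir" traces above come from Spark's DiskBlockManager failing to mkdir under spark.local.dir (which defaults to /tmp). On a shared CI agent this is typically a full disk, exhausted inodes, or a /tmp that another job cleaned or re-permissioned. A hedged, read-only diagnosis sketch one could run on the agent:

```shell
# Read-only checks for the usual causes of DiskBlockManager mkdir failures.
# All three commands only inspect state; they change nothing on the agent.
df -h /tmp    # free space on the filesystem backing /tmp
df -i /tmp    # free inodes (a full inode table also breaks mkdir)
ls -ld /tmp   # permissions; /tmp normally shows the sticky bit: drwxrwxrwt
```

If all three look healthy, the block-manager directory itself (here /tmp/blockmgr-7b5e072b-...) may have been removed mid-run by a tmp reaper.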
2019-08-05T22:07:33.885Z Running query:HOT_ITEMS; exportSummaryToBigQuery:true; streamTimeout:60
==========================================================================================
Run started 2019-08-05T22:07:15.068Z and ran for PT18.817S
Default configuration:
{"debug":true,"query":null,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","sideInputType":"DIRECT","sideInputRowCount":500,"sideInputNumShards":3,"sideInputUrl":null,"sessionGap":{"standardDays":0,"standardHours":0,"standardMinutes":10,"standardSeconds":600,"millis":600000},"numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
Configurations:
Conf Description
0000 query:PASSTHROUGH; exportSummaryToBigQuery:true; streamTimeout:60
0001 query:CURRENCY_CONVERSION; exportSummaryToBigQuery:true; streamTimeout:60
0002 query:SELECTION; exportSummaryToBigQuery:true; streamTimeout:60
0003 query:LOCAL_ITEM_SUGGESTION; exportSummaryToBigQuery:true; streamTimeout:60
0004 query:AVERAGE_PRICE_FOR_CATEGORY; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0005 query:HOT_ITEMS; exportSummaryToBigQuery:true; streamTimeout:60
0006 query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0007 query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
0008 query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
0009 query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
0010 query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
0011 query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
0012 query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
0013 query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
0014 query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
Performance:
Conf Runtime(sec) (Baseline) Events(/sec) (Baseline) Results (Baseline)
0000 1.3 75188.0 100000
0001 0.6 165837.5 92000
0002 0.3 326797.4 351
0003 2.1 46882.3 580
0004 *** not run ***
0005 *** not run ***
0006 *** not run ***
0007 *** not run ***
0008 *** not run ***
0009 *** not run ***
0010 *** not run ***
0011 *** not run ***
0012 *** not run ***
0013 *** not run ***
0014 *** not run ***
==========================================================================================
Exception in thread "main" java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/13.
at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:128)
at org.apache.beam.sdk.nexmark.Main.main(Main.java:415)
Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/13.
at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:67)
at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:98)
at org.apache.beam.sdk.nexmark.NexmarkLauncher.run(NexmarkLauncher.java:1178)
at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:90)
at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:79)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to create local dir in /tmp/blockmgr-7b5e072b-31b9-49fe-ac87-a7efa88accca/13.
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
... 3 more
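When the agent's /tmp is the problem, a common mitigation is to move Spark's scratch space elsewhere. SPARK_LOCAL_DIRS (equivalently the spark.local.dir property) is a real Spark setting; the directory path below is an assumption for illustration, not the path this Jenkins job uses:

```shell
# Hypothetical workaround sketch: point Spark's shuffle/scratch space at a
# writable directory instead of the default /tmp. The path is illustrative.
SCRATCH_DIR="${SCRATCH_DIR:-$PWD/spark-scratch}"
mkdir -p "$SCRATCH_DIR"                 # Spark will not create the parent itself
export SPARK_LOCAL_DIRS="$SCRATCH_DIR"  # overrides spark.local.dir for local runs
echo "Spark local dir: $SPARK_LOCAL_DIRS"
```

Setting it via `--conf spark.local.dir=...` on spark-submit achieves the same thing for submitted jobs.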
2019-08-05T22:07:33.888Z Generating 100000 events in batch mode
2019-08-05T22:07:38.418Z Waiting for main pipeline to 'finish'
2019-08-05T22:07:38.418Z DONE Query5
2019-08-05T22:07:38.424Z Running query:AVERAGE_SELLING_PRICE_BY_SELLER; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2019-08-05T22:07:38.427Z Generating 10000 events in batch mode
2019-08-05T22:07:38.433Z Expected auction duration is 16667 ms
2019-08-05T22:07:42.409Z Waiting for main pipeline to 'finish'
2019-08-05T22:07:42.410Z DONE Query6
2019-08-05T22:07:42.415Z Running query:HIGHEST_BID; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:07:42.418Z Generating 100000 events in batch mode
2019-08-05T22:07:48.011Z Waiting for main pipeline to 'finish'
2019-08-05T22:07:48.011Z DONE Query7
2019-08-05T22:07:48.016Z Running query:MONITOR_NEW_USERS; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:07:48.019Z Generating 100000 events in batch mode
2019-08-05T22:07:53.202Z Waiting for main pipeline to 'finish'
2019-08-05T22:07:53.202Z DONE Query8
2019-08-05T22:07:53.207Z Running query:WINNING_BIDS; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
2019-08-05T22:07:53.209Z Generating 10000 events in batch mode
2019-08-05T22:07:53.215Z Expected auction duration is 16667 ms
2019-08-05T22:07:57.716Z Waiting for main pipeline to 'finish'
2019-08-05T22:07:57.716Z DONE Query9
2019-08-05T22:07:57.719Z Running query:LOG_TO_SHARDED_FILES; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:07:57.722Z Generating 100000 events in batch mode
19/08/05 22:07:57 WARN org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: Either coder VoidCoder or IntervalWindow$IntervalWindowCoder is not consistent with equals. That might cause issues on some runners.
2019-08-05T22:08:02.563Z Waiting for main pipeline to 'finish'
2019-08-05T22:08:02.563Z DONE Query10
2019-08-05T22:08:02.566Z Running query:USER_SESSIONS; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:08:02.568Z Generating 100000 events in batch mode
2019-08-05T22:08:08.497Z Waiting for main pipeline to 'finish'
2019-08-05T22:08:08.497Z DONE Query11
2019-08-05T22:08:08.499Z Running query:PROCESSING_TIME_WINDOWS; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:08:08.502Z Generating 100000 events in batch mode
2019-08-05T22:08:13.737Z Waiting for main pipeline to 'finish'
2019-08-05T22:08:13.737Z DONE Query12
2019-08-05T22:08:13.740Z Running query:BOUNDED_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:08:13.742Z Generating 100000 events in batch mode
2019-08-05T22:08:18.717Z Waiting for main pipeline to 'finish'
2019-08-05T22:08:18.717Z DONE BoundedSideInputJoin
2019-08-05T22:08:18.720Z Running query:SESSION_SIDE_INPUT_JOIN; exportSummaryToBigQuery:true; streamTimeout:60
2019-08-05T22:08:18.722Z Generating 100000 events in batch mode
2019-08-05T22:08:24.847Z Waiting for main pipeline to 'finish'
2019-08-05T22:08:24.847Z DONE BoundedSideInputJoin
> Task :sdks:java:testing:nexmark:run FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:testing:nexmark:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
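To reproduce with the extra diagnostics Gradle suggests above, the failing task can be re-run locally. This sketch assumes the Beam repository root as the working directory and only echoes the command rather than executing it:

```shell
# Re-run command for the failing Nexmark task, with the flags Gradle
# recommends for a full stack trace and verbose log output.
RERUN="./gradlew :sdks:java:testing:nexmark:run --stacktrace --info"
echo "$RERUN"
```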
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2m 1s
69 actionable tasks: 46 executed, 23 from cache
Publishing build scan...
https://gradle.com/s/d3krj3gq7qkx6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Java_Nexmark_Spark #3300
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/3300/display/redirect?page=changes>