Posted to commits@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2018/10/01 15:33:42 UTC

Build failed in Jenkins: beam_PostCommit_Java_Nexmark_Spark #612

See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/612/display/redirect?page=changes>

Changes:

[lcwik] [BEAM-5487] ByteKeyRangeTracker restrictions do not cover the entire

------------------------------------------
[...truncated 216.79 KB...]
  Conf  Runtime(sec)    (Baseline)  Events(/sec)    (Baseline)       Results    (Baseline)
  0000           2.8                     35310.7                      100000              
  0001           1.7                     59701.5                       92000              
  0002           1.5                     66445.2                         351              
  0003           5.1                     19782.4                         580              
  0004           1.7                      5757.1                          40              
  0005           2.9                     35075.4                          12              
  0006           1.5                      6635.7                         103              
  0007           4.5                     22041.0                           1              
  0008           3.4                     29095.1                        6000              
  0009           1.9                      5402.5                         298              
  0010  *** not run ***
  0011  *** not run ***
  0012  *** not run ***
==========================================================================================

2018-10-01T15:33:16.642Z Generating 100000 events in batch mode
18/10/01 15:33:21 ERROR org.apache.spark.executor.Executor: Exception in task 2.0 in stage 0.0 (TID 2)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/1c.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 ERROR org.apache.spark.executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/20.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 ERROR org.apache.spark.executor.Executor: Exception in task 3.0 in stage 0.0 (TID 3)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/3b.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/1e.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 3.0 in stage 0.0 (TID 3, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/3b.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

18/10/01 15:33:21 ERROR org.apache.spark.scheduler.TaskSetManager: Task 3 in stage 0.0 failed 1 times; aborting job
18/10/01 15:33:21 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/1e.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

18/10/01 15:33:21 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 2.0 in stage 0.0 (TID 2, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/1c.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

18/10/01 15:33:21 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/20.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
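All four task failures point under the same `blockmgr-<uuid>` root, and the two-hex-character suffixes (`1c`, `20`, `3b`, `1e`) are the hashed subdirectories Spark's DiskBlockManager spreads block files across. That pattern suggests the Jenkins agent's `/tmp` was full or unwritable, rather than anything in the Beam change. A minimal sketch of the layout Spark tries to create, under stated assumptions (the hash function here is Python's built-in as a stand-in for Spark's internal one, and the 64-subdirectory default is assumed):

```python
import os
import tempfile

def local_block_dir(root: str, filename: str, sub_dirs: int = 64) -> str:
    """Sketch of how Spark's DiskBlockManager places a block file in a
    hashed two-hex-char subdirectory (e.g. blockmgr-<uuid>/1c/<file>).
    Python's hash() is a stand-in for Spark's own hash here."""
    sub = format(abs(hash(filename)) % sub_dirs, "02x")
    return os.path.join(root, sub, filename)

def ensure_dir(path: str) -> None:
    # Spark raises java.io.IOException "Failed to create local dir ..."
    # when this mkdir fails -- typically a full disk or bad permissions
    # on the parent volume (here, /tmp on the build agent).
    os.makedirs(os.path.dirname(path), exist_ok=True)

root = tempfile.mkdtemp(prefix="blockmgr-")
p = local_block_dir(root, "temp_shuffle_0")
ensure_dir(p)
print(os.path.isdir(os.path.dirname(p)))
```

If the diagnosis is right, the usual mitigations are freeing space on the agent or pointing `spark.local.dir` at a writable volume.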

2018-10-01T15:33:21.351Z Running query:11; exportSummaryToBigQuery:true; streamTimeout:60

Exception in thread "main" ==========================================================================================
Run started 2018-10-01T15:32:07.792Z and ran for PT73.559S

Default configuration:
{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL","exportSummaryToBigQuery":false,"pubSubMode":"COMBINED","numEvents":100000,"numEventGenerators":100,"rateShape":"SINE","firstEventRate":10000,"nextEventRate":10000,"rateUnit":"PER_SECOND","ratePeriodSec":600,"preloadSeconds":0,"streamTimeout":240,"isRateLimited":false,"useWallclockEventTime":false,"avgPersonByteSize":200,"avgAuctionByteSize":500,"avgBidByteSize":100,"hotAuctionRatio":2,"hotSellersRatio":4,"hotBiddersRatio":4,"windowSizeSec":10,"windowPeriodSec":5,"watermarkHoldbackSec":0,"numInFlightAuctions":100,"numActivePeople":1000,"coderStrategy":"HAND","cpuDelayMs":0,"diskBusyBytes":0,"auctionSkip":123,"fanout":5,"maxAuctionsWaitingTime":600,"occasionalDelaySec":3,"probDelayedEvent":0.1,"maxLogEvents":100000,"usePubsubPublishTime":false,"outOfOrderGroupSize":1}
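The default configuration above is printed as a single JSON object, so it can be inspected mechanically when comparing runs; a small sketch using an abbreviated excerpt of that line (only a few of its keys are copied here):

```python
import json

# Abbreviated excerpt of the "Default configuration" JSON printed above.
default_conf = json.loads(
    '{"debug":true,"query":0,"sourceType":"DIRECT","sinkType":"DEVNULL",'
    '"numEvents":100000,"numEventGenerators":100,"streamTimeout":240}'
)
print(default_conf["numEvents"], default_conf["sinkType"])  # prints "100000 DEVNULL"
```

Note that the per-query configurations below override only a few of these keys (e.g. `streamTimeout:60`, and `numEvents:10000` for queries 4, 6, and 9).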

Configurations:
  Conf  Description
  0000  query:0; exportSummaryToBigQuery:true; streamTimeout:60
  0001  query:1; exportSummaryToBigQuery:true; streamTimeout:60
  0002  query:2; exportSummaryToBigQuery:true; streamTimeout:60
  0003  query:3; exportSummaryToBigQuery:true; streamTimeout:60
  0004  query:4; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0005  query:5; exportSummaryToBigQuery:true; streamTimeout:60
  0006  query:6; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0007  query:7; exportSummaryToBigQuery:true; streamTimeout:60
java.lang.RuntimeException: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/3b.
  0008  query:8; exportSummaryToBigQuery:true; streamTimeout:60
	at org.apache.beam.sdk.nexmark.Main.runAll(Main.java:144)
	at org.apache.beam.sdk.nexmark.Main.main(Main.java:477)
  0009  query:9; exportSummaryToBigQuery:true; numEvents:10000; streamTimeout:60
  0010  query:10; exportSummaryToBigQuery:true; streamTimeout:60
  0011  query:11; exportSummaryToBigQuery:true; streamTimeout:60
  0012  query:12; exportSummaryToBigQuery:true; streamTimeout:60

Performance:
  Conf  Runtime(sec)    (Baseline)  Events(/sec)    (Baseline)       Results    (Baseline)
  0000           2.8                     35310.7                      100000              
  0001           1.7                     59701.5                       92000              
  0002           1.5                     66445.2                         351              
  0003           5.1                     19782.4                         580              
  0004           1.7                      5757.1                          40              
  0005           2.9                     35075.4                          12              
  0006           1.5                      6635.7                         103              
  0007           4.5                     22041.0                           1              
  0008           3.4                     29095.1                        6000              
  0009           1.9                      5402.5                         298              
  0010  *** not run ***
  0011  *** not run ***
  0012  *** not run ***
==========================================================================================

Caused by: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/3b.
	at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:68)
	at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:99)
	at org.apache.beam.sdk.nexmark.NexmarkLauncher.run(NexmarkLauncher.java:1280)
	at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:108)
	at org.apache.beam.sdk.nexmark.Main$Run.call(Main.java:96)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
2018-10-01T15:33:21.355Z Generating 100000 events in batch mode
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to create local dir in /tmp/blockmgr-35bdd237-ccd0-4157-ba2e-0eab765adc5a/3b.
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:80)
	at org.apache.spark.storage.DiskBlockManager.createTempShuffleBlock(DiskBlockManager.scala:127)
	at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:137)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
	at org.apache.spark.scheduler.Task.run(Task.scala:109)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	... 3 more
18/10/01 15:33:21 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Executor task launch worker for task 4
java.lang.NullPointerException
	at org.apache.spark.scheduler.Task$$anonfun$run$1.apply$mcV$sp(Task.scala:130)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1360)
	at org.apache.spark.scheduler.Task.run(Task.scala:128)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Executor task launch worker for task 5
java.lang.NullPointerException
	at org.apache.spark.scheduler.Task$$anonfun$run$1.apply$mcV$sp(Task.scala:130)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1360)
	at org.apache.spark.scheduler.Task.run(Task.scala:128)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/10/01 15:33:21 ERROR org.apache.spark.util.Utils: Uncaught exception in thread Executor task launch worker for task 6
java.lang.NullPointerException
	at org.apache.spark.scheduler.Task$$anonfun$run$1.apply$mcV$sp(Task.scala:130)
	at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1360)
	at org.apache.spark.scheduler.Task.run(Task.scala:128)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
2018-10-01T15:33:29.269Z Waiting for main pipeline to 'finish'
2018-10-01T15:33:29.269Z DONE Query11
2018-10-01T15:33:29.271Z Running query:12; exportSummaryToBigQuery:true; streamTimeout:60
2018-10-01T15:33:29.274Z Generating 100000 events in batch mode
2018-10-01T15:33:37.810Z Waiting for main pipeline to 'finish'
2018-10-01T15:33:37.810Z DONE Query12

> Task :beam-sdks-java-nexmark:run FAILED
:beam-sdks-java-nexmark:run (Thread[Task worker for ':' Thread 10,5,main]) completed. Took 1 mins 32.296 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-nexmark:run'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_172/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 4m 58s
64 actionable tasks: 60 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/2gkjukwxnixes

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

Jenkins build is back to normal : beam_PostCommit_Java_Nexmark_Spark #613

See <https://builds.apache.org/job/beam_PostCommit_Java_Nexmark_Spark/613/display/redirect?page=changes>