Posted to dev@mahout.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2014/10/11 19:34:34 UTC

Build failed in Jenkins: Mahout-Quality #2823

See <https://builds.apache.org/job/Mahout-Quality/2823/>

------------------------------------------
[...truncated 6180 lines...]
	at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
	at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:141)
	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1814)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1773)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:165)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
	at java.lang.Thread.run(Thread.java:662)
13 [Result resolver thread-1] ERROR org.apache.spark.scheduler.TaskSetManager  - Task 67.0:0 failed 1 times; aborting job
- cooccurrence two matrices with different number of columns *** FAILED ***
  org.apache.spark.SparkException: Job aborted due to stage failure: Task 67.0:0 failed 1 times, most recent failure: Exception failure in TID 137 on host localhost: java.io.FileNotFoundException: http://67.195.81.155:42012/broadcast_13
        sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1457)
        org.apache.spark.broadcast.HttpBroadcast$.read(HttpBroadcast.scala:196)
        org.apache.spark.broadcast.HttpBroadcast.readObject(HttpBroadcast.scala:89)
        sun.reflect.GeneratedMethodAccessor7.invoke(Unknown Source)
        sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        java.lang.reflect.Method.invoke(Method.java:597)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        scala.collection.immutable.$colon$colon.readObject(List.scala:362)
        sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
        sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        java.lang.reflect.Method.invoke(Method.java:597)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        scala.collection.immutable.$colon$colon.readObject(List.scala:362)
        sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
        sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        java.lang.reflect.Method.invoke(Method.java:597)
        java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1871)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1969)
        java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1775)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
        org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:141)
        java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1814)
        java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1773)
        java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1327)
        java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
        org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:63)
        org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:85)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:165)
        java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        java.lang.Thread.run(Thread.java:662)
Driver stacktrace:
  at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1044)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1028)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1026)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
  at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1026)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:634)
  at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:634)
  at scala.Option.foreach(Option.scala:236)
  at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:634)
  ...
- LLR calc
- downsampling by number per row
RLikeDrmOpsSuite:
- A.t
{
  1  =>	{0:25.0,1:39.0}
  0  =>	{0:11.0,1:17.0}
}
{
  1  =>	{0:25.0,1:39.0}
  0  =>	{0:11.0,1:17.0}
}
- C = A %*% B
{
  0  =>	{0:11.0,1:17.0}
  1  =>	{0:25.0,1:39.0}
}
{
  0  =>	{0:11.0,1:17.0}
  1  =>	{0:25.0,1:39.0}
}
Q=
{
  0  =>	{0:0.40273861426601687,1:-0.9153150324187648}
  1  =>	{0:0.9153150324227656,1:0.40273861426427493}
}
- C = A %*% B mapBlock {}
- C = A %*% B incompatible B keys
- Spark-specific C = At %*% B , join
- C = At %*% B , join, String-keyed
- C = At %*% B , zippable, String-keyed
{
  0  =>	{0:26.0,1:35.0,2:46.0,3:51.0}
  1  =>	{0:50.0,1:69.0,2:92.0,3:105.0}
  2  =>	{0:62.0,1:86.0,2:115.0,3:132.0}
  3  =>	{0:74.0,1:103.0,2:138.0,3:159.0}
}
- C = A %*% inCoreB
{
  0  =>	{0:26.0,1:35.0,2:46.0,3:51.0}
  1  =>	{0:50.0,1:69.0,2:92.0,3:105.0}
  2  =>	{0:62.0,1:86.0,2:115.0,3:132.0}
  3  =>	{0:74.0,1:103.0,2:138.0,3:159.0}
}
- C = inCoreA %*%: B
- C = A.t %*% A
- C = A.t %*% A fat non-graph
- C = A.t %*% A non-int key
- C = A + B
A=
{
  0  =>	{0:1.0,1:2.0,2:3.0}
  1  =>	{0:3.0,1:4.0,2:5.0}
  2  =>	{0:5.0,1:6.0,2:7.0}
}
B=
{
  0  =>	{0:0.6466025679349946,1:0.091268535935215,2:0.17249685953169158}
  1  =>	{0:0.904341630593639,1:0.9811980128492591,2:0.9353580805921472}
  2  =>	{0:0.25262534129850467,1:0.6321384963248696,2:0.47291500929689123}
}
C=
{
  0  =>	{0:1.6466025679349947,1:2.091268535935215,2:3.172496859531692}
  1  =>	{0:3.904341630593639,1:4.9811980128492594,2:5.935358080592147}
  2  =>	{0:5.252625341298504,1:6.6321384963248695,2:7.472915009296891}
}
- C = A + B, identically partitioned
- C = A + B side test 1
- C = A + B side test 2
- C = A + B side test 3
- Ax
- A'x
- colSums, colMeans
- rowSums, rowMeans
- A.diagv
- numNonZeroElementsPerColumn
- C = A cbind B, cogroup
- C = A cbind B, zip
- B = A + 1.0
- C = A rbind B
- C = A rbind B, with empty
- scalarOps
- C = A + B missing rows
- C = cbind(A, B) with missing rows
collected A = 
{
  0  =>	{0:1.0,1:2.0,2:3.0}
  1  =>	{}
  2  =>	{}
  3  =>	{0:3.0,1:4.0,2:5.0}
}
collected B = 
{
  2  =>	{0:1.0,1:1.0,2:1.0}
  1  =>	{0:1.0,1:1.0,2:1.0}
  3  =>	{0:4.0,1:5.0,2:6.0}
  0  =>	{0:2.0,1:3.0,2:4.0}
}
- B = A + 1.0 missing rows
Run completed in 1 minute, 12 seconds.
Total number of tests run: 74
Suites: completed 10, aborted 0
Tests: succeeded 73, failed 1, canceled 0, ignored 1, pending 0
*** 1 TEST FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Mahout Build Tools ................................ SUCCESS [3.510s]
[INFO] Apache Mahout ..................................... SUCCESS [1.509s]
[INFO] Mahout Math ....................................... SUCCESS [2:14.285s]
[INFO] Mahout MapReduce Legacy ........................... SUCCESS [10:40.168s]
[INFO] Mahout Integration ................................ SUCCESS [1:11.919s]
[INFO] Mahout Examples ................................... SUCCESS [41.450s]
[INFO] Mahout Release Package ............................ SUCCESS [0.053s]
[INFO] Mahout Math Scala bindings ........................ SUCCESS [1:58.491s]
[INFO] Mahout Spark bindings ............................. FAILURE [1:53.488s]
[INFO] Mahout Spark bindings shell ....................... SKIPPED
[INFO] Mahout H2O backend ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:46.737s
[INFO] Finished at: Sat Oct 11 17:33:57 UTC 2014
[INFO] Final Memory: 81M/418M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0-M2:test (test) on project mahout-spark_2.10: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :mahout-spark_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Mahout-Quality #2822
Archived 72 artifacts
Archive block size is 32768
Received 2278 blocks and 16675152 bytes
Compression is 81.7%
Took 17 sec
Recording test results
Publishing Javadoc

Jenkins build is back to normal : Mahout-Quality #2824

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mahout-Quality/2824/>