Posted to dev@mahout.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/04/02 19:16:46 UTC
Build failed in Jenkins: Mahout-Quality #3043
See <https://builds.apache.org/job/Mahout-Quality/3043/>
------------------------------------------
Started by timer
Building remotely on ubuntu-1 (docker Ubuntu ubuntu ubuntu1) in workspace <https://builds.apache.org/job/Mahout-Quality/ws/>
> git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
> git config remote.origin.url https://git-wip-us.apache.org/repos/asf/mahout.git # timeout=10
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/mahout.git
> git --version # timeout=10
FATAL: Failed to fetch from https://git-wip-us.apache.org/repos/asf/mahout.git
hudson.plugins.git.GitException: Failed to fetch from https://git-wip-us.apache.org/repos/asf/mahout.git
at hudson.plugins.git.GitSCM.fetchFrom(GitSCM.java:647)
at hudson.plugins.git.GitSCM.retrieveChanges(GitSCM.java:889)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:914)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1252)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:615)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:86)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:524)
at hudson.model.Run.execute(Run.java:1706)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:232)
Caused by: hudson.plugins.git.GitException: Failed to connect to https://git-wip-us.apache.org/repos/asf/mahout.git (exception: org.apache.http.conn.ConnectTimeoutException: Connect to git-wip-us.apache.org:443 [git-wip-us.apache.org/140.211.11.23] failed: Connection timed out)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.checkCredentials(CliGitAPIImpl.java:2253)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandWithCredentials(CliGitAPIImpl.java:1169)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.access$300(CliGitAPIImpl.java:85)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$1.execute(CliGitAPIImpl.java:280)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:153)
at org.jenkinsci.plugins.gitclient.RemoteGitImpl$CommandInvocationHandler$1.call(RemoteGitImpl.java:146)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
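The root cause above is a plain TCP connect timeout to git-wip-us.apache.org:443, i.e. a network-reachability failure rather than a problem with the repository (build #3045 below recovered with no code change). The Jenkins git plugin has its own timeout/retry settings; purely as an illustrative sketch, a shell-level retry-with-backoff wrapper (the function name and parameters here are hypothetical, not part of Jenkins) could look like:

```shell
#!/bin/sh
# Hypothetical helper: retry a flaky command with exponential backoff.
fetch_with_retry() {
    attempts=3
    delay=2
    i=1
    while [ "$i" -le "$attempts" ]; do
        if "$@"; then
            return 0
        fi
        echo "attempt $i failed; retrying in ${delay}s" >&2
        sleep "$delay"
        delay=$((delay * 2))
        i=$((i + 1))
    done
    return 1
}

# Usage sketch: wrap the fetch that timed out (URL taken from the log above).
# fetch_with_retry git fetch https://git-wip-us.apache.org/repos/asf/mahout.git
fetch_with_retry true && echo "ok"
```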
------------------------------------------
Jenkins build is back to normal : Mahout-Quality #3045
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mahout-Quality/3045/changes>
------------------------------------------
Build failed in Jenkins: Mahout-Quality #3044
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Mahout-Quality/3044/changes>
Changes:
[apalumbo] (nojira) set spark.executor.memory = 1g in spark-shell. fix -ma option in 20newsgroups shell script. a few other mostly cosmetic changes, version numbers, and shell script example. closes apache/mahout#95
------------------------------------------
[...truncated 6681 lines...]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
2484 [sparkDriver-akka.actor.default-dispatcher-17] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
DrmLikeOpsSuite:
{
0 => {0:2.0,1:3.0,2:4.0}
1 => {0:3.0,1:4.0,2:5.0}
2 => {0:4.0,1:5.0,2:6.0}
3 => {0:5.0,1:6.0,2:7.0}
}
- mapBlock
{
0 => {0:2.0,1:3.0}
1 => {0:3.0,1:4.0}
2 => {0:4.0,1:5.0}
3 => {0:5.0,1:6.0}
}
- col range
{
0 => {0:2.0,1:3.0,2:4.0}
1 => {0:3.0,1:4.0,2:5.0}
}
- row range
{
0 => {0:3.0,1:4.0}
1 => {0:4.0,1:5.0}
}
- col, row range
- exact, min and auto ||
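The matrices printed by DrmLikeOpsSuite above are consistent with plain row/column range views of the 4x3 matrix printed first (the log shows only outputs, so treating that matrix as the input is an assumption). A numpy sketch, not Mahout's Scala DSL:

```python
import numpy as np

# The 4x3 matrix printed before "- mapBlock" above.
A = np.array([[2., 3., 4.],
              [3., 4., 5.],
              [4., 5., 6.],
              [5., 6., 7.]])

col_range = A[:, 0:2]    # matches the matrix printed before "- col range"
row_range = A[0:2, :]    # matches the matrix printed before "- row range"
both      = A[1:3, 0:2]  # matches the matrix printed before "- col, row range"
```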
NBSparkTestSuite:
- Simple Standard NB Model
- NB Aggregator
- Model DFS Serialization
- Spark NB Aggregator
8213 [sparkDriver-akka.actor.default-dispatcher-18] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
ItemSimilarityDriverSuite:
- ItemSimilarityDriver, non-full-spec CSV
- ItemSimilarityDriver TSV
- ItemSimilarityDriver log-ish files
- ItemSimilarityDriver legacy supported file format
- ItemSimilarityDriver write search engine output
- ItemSimilarityDriver recursive file discovery using filename patterns
- ItemSimilarityDriver, two input paths
- ItemSimilarityDriver, two inputs of different dimensions
- ItemSimilarityDriver cross similarity two separate items spaces
- A.t %*% B after changing row cardinality of A
- Changing row cardinality of an IndexedDataset
- ItemSimilarityDriver cross similarity two separate items spaces, missing rows in B
log4j:WARN No appenders could be found for logger (akka.remote.RemoteActorRefProvider$RemotingTerminator).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
BlasSuite:
AB' num partitions = 2.
{
2 => {0:50.0,1:74.0}
1 => {0:38.0,1:56.0}
0 => {0:26.0,1:38.0}
}
- ABt
- A * B Hadamard
- A + B Elementwise
- A - B Elementwise
- A / B Elementwise
{
0 => {0:5.0,1:8.0}
1 => {0:8.0,1:13.0}
}
{
0 => {0:5.0,1:8.0}
1 => {0:8.0,1:13.0}
}
- AtA slim
{
0 => {0:1.0,1:2.0,2:3.0}
1 => {0:2.0,1:3.0,2:4.0}
2 => {0:3.0,1:4.0,2:5.0}
}
- At
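Two of the BlasSuite printouts above can be checked numerically. The "AtA slim" result ({0:5.0,1:8.0} / {0:8.0,1:13.0}) is what A'A yields for, e.g., A = [[1,2],[2,3]] (the test fixture itself is not in the log, so that A is an assumption); and the 3x3 matrix printed before "- At" is symmetric, so it equals its own transpose. A quick numpy sketch:

```python
import numpy as np

# Assumed fixture: one A whose Gram matrix A'A matches the printed "AtA slim"
# result; the log does not show the actual input.
A = np.array([[1., 2.],
              [2., 3.]])
AtA = A.T @ A  # expected to equal [[5, 8], [8, 13]]

# The 3x3 matrix printed before "- At" is symmetric, hence self-transpose.
M = np.array([[1., 2., 3.],
              [2., 3., 4.],
              [3., 4., 5.]])
```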
SimilarityAnalysisSuite:
- cooccurrence [A'A], [B'A] boolbean data using LLR
- cooccurrence [A'A], [B'A] double data using LLR
- cooccurrence [A'A], [B'A] integer data using LLR
- cooccurrence two matrices with different number of columns
- LLR calc
- downsampling by number per row
ClassifierStatsSparkTestSuite:
- testFullRunningAverageAndStdDev
- testBigFullRunningAverageAndStdDev
- testStddevFullRunningAverageAndStdDev
- testFullRunningAverage
- testFullRunningAveragCopyConstructor
- testInvertedRunningAverage
- testInvertedRunningAverageAndStdDev
- testBuild
- GetMatrix
- testPrecisionRecallAndF1ScoreAsScikitLearn
32505 [sparkDriver-akka.actor.default-dispatcher-5] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
RLikeDrmOpsSuite:
- A.t
{
1 => {0:25.0,1:39.0}
0 => {0:11.0,1:17.0}
}
{
1 => {0:25.0,1:39.0}
0 => {0:11.0,1:17.0}
}
- C = A %*% B
{
0 => {0:11.0,1:17.0}
1 => {0:25.0,1:39.0}
}
{
0 => {0:11.0,1:17.0}
1 => {0:25.0,1:39.0}
}
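The two "C = A %*% B" printouts above differ only in row order because DRM rows are keyed rather than ordered. The product itself checks out for, e.g., A = [[1,2],[3,4]] and B = [[3,5],[4,6]] (assumed fixtures; the log prints only the result). In numpy terms:

```python
import numpy as np

# Assumed inputs: one (A, B) pair reproducing the rows
# {0: [11, 17], 1: [25, 39]} printed above.
A = np.array([[1., 2.],
              [3., 4.]])
B = np.array([[3., 5.],
              [4., 6.]])
C = A @ B
```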
Q=
{
0 => {0:0.40273861426601687,1:-0.9153150324187648}
1 => {0:0.9153150324227656,1:0.40273861426427493}
}
- C = A %*% B mapBlock {}
- C = A %*% B incompatible B keys
- Spark-specific C = At %*% B , join
- C = At %*% B , join, String-keyed
- C = At %*% B , zippable, String-keyed
{
0 => {0:26.0,1:35.0,2:46.0,3:51.0}
1 => {0:50.0,1:69.0,2:92.0,3:105.0}
2 => {0:62.0,1:86.0,2:115.0,3:132.0}
3 => {0:74.0,1:103.0,2:138.0,3:159.0}
}
- C = A %*% inCoreB
{
0 => {0:26.0,1:35.0,2:46.0,3:51.0}
1 => {0:50.0,1:69.0,2:92.0,3:105.0}
2 => {0:62.0,1:86.0,2:115.0,3:132.0}
3 => {0:74.0,1:103.0,2:138.0,3:159.0}
}
- C = inCoreA %*%: B
- C = A.t %*% A
- C = A.t %*% A fat non-graph
- C = A.t %*% A non-int key
- C = A + B
A=
{
0 => {0:1.0,1:2.0,2:3.0}
1 => {0:3.0,1:4.0,2:5.0}
2 => {0:5.0,1:6.0,2:7.0}
}
B=
{
0 => {0:0.6071828736747092,1:0.03658255164056168,2:0.409333795339629}
1 => {0:0.6181601353236361,1:0.26440320571545295,2:0.3915912514714457}
2 => {0:0.2719595981451286,1:0.07037439372150167,2:0.3415545531128936}
}
C=
{
0 => {0:1.6071828736747094,1:2.036582551640562,2:3.409333795339629}
1 => {0:3.6181601353236363,1:4.264403205715453,2:5.3915912514714455}
2 => {0:5.271959598145129,1:6.070374393721502,2:7.341554553112894}
}
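The A=, B=, C= printout above is a plain elementwise sum. A quick numpy check of the logged values (a sketch, not the test itself; the printed C agrees with A + B up to floating-point rounding in the last digit):

```python
import numpy as np

# Values copied verbatim from the A=, B=, C= printout above.
A = np.array([[1., 2., 3.],
              [3., 4., 5.],
              [5., 6., 7.]])
B = np.array([[0.6071828736747092, 0.03658255164056168, 0.409333795339629],
              [0.6181601353236361, 0.26440320571545295, 0.3915912514714457],
              [0.2719595981451286, 0.07037439372150167, 0.3415545531128936]])
C = np.array([[1.6071828736747094, 2.036582551640562, 3.409333795339629],
              [3.6181601353236363, 4.264403205715453, 5.3915912514714455],
              [5.271959598145129, 6.070374393721502, 7.341554553112894]])
# A + B matches C to within floating-point tolerance.
```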
- C = A + B, identically partitioned
- C = A + B side test 1
- C = A + B side test 2
- C = A + B side test 3
- Ax
- A'x
- colSums, colMeans
- rowSums, rowMeans
- A.diagv
- numNonZeroElementsPerColumn
- C = A cbind B, cogroup
- C = A cbind B, zip
- B = A + 1.0
- C = A rbind B
- C = A rbind B, with empty
- scalarOps
- C = A + B missing rows
- C = cbind(A, B) with missing rows
collected A =
{
0 => {0:1.0,1:2.0,2:3.0}
1 => {}
2 => {}
3 => {0:3.0,1:4.0,2:5.0}
}
collected B =
{
2 => {0:1.0,1:1.0,2:1.0}
1 => {0:1.0,1:1.0,2:1.0}
3 => {0:4.0,1:5.0,2:6.0}
0 => {0:2.0,1:3.0,2:4.0}
}
- B = A + 1.0 missing rows
TFIDFSparkTestSuite:
- TF test
- TFIDF test
- MLlib TFIDF test
Run completed in 1 minute, 26 seconds.
Total number of tests run: 92
Suites: completed 13, aborted 0
Tests: succeeded 90, failed 2, canceled 0, ignored 1, pending 0
*** 2 TESTS FAILED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Mahout Build Tools ................................ SUCCESS [5.053s]
[INFO] Apache Mahout ..................................... SUCCESS [1.172s]
[INFO] Mahout Math ....................................... SUCCESS [2:08.230s]
[INFO] Mahout HDFS ....................................... SUCCESS [14.343s]
[INFO] Mahout Map-Reduce ................................. SUCCESS [13:46.969s]
[INFO] Mahout Integration ................................ SUCCESS [1:18.840s]
[INFO] Mahout Examples ................................... SUCCESS [44.999s]
[INFO] Mahout Release Package ............................ SUCCESS [0.507s]
[INFO] Mahout Math Scala bindings ........................ SUCCESS [1:56.049s]
[INFO] Mahout Spark bindings ............................. FAILURE [2:05.329s]
[INFO] Mahout Spark bindings shell ....................... SKIPPED
[INFO] Mahout H2O backend ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 22:23.536s
[INFO] Finished at: Thu Apr 02 21:16:37 UTC 2015
[INFO] Final Memory: 74M/419M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project mahout-spark_2.10: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :mahout-spark_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Mahout-Quality #3042
Archived 94 artifacts
Archive block size is 32768
Received 6927 blocks and 15061012 bytes
Compression is 93.8%
Took 1 min 0 sec
Recording test results
Publishing Javadoc