Posted to dev@systemml.apache.org by je...@spark.tc on 2017/06/05 07:30:51 UTC

Build failed in Jenkins: SystemML-DailyTest #1032

See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/1032/changes>

Changes:

[Matthias Boehm] [SYSTEMML-1664] Fix handling of non-default DFS file URI schemes
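The fix above concerns file paths whose URI scheme differs from the cluster's default DFS. As a hedged illustration of the underlying issue (this is not SystemML's actual fix; names and URIs are hypothetical), a path that carries its own scheme must keep it rather than being forced onto the default file system:

```java
import java.net.URI;

// Hedged sketch: why non-default DFS URI schemes need explicit handling.
// The default-FS URI and all paths below are hypothetical examples.
public class UriSchemeDemo {
    public static void main(String[] args) {
        // A hypothetical fs.defaultFS value:
        URI defaultFs = URI.create("hdfs://namenode:8020/");
        // A scheme-less path is resolved against the default file system:
        System.out.println(defaultFs.resolve("/user/data/in.csv"));
        // A fully qualified URI keeps its own, non-default scheme:
        System.out.println(URI.create("s3a://bucket/in.csv").getScheme());
    }
}
```

In Hadoop-based code the same idea is usually expressed by looking the `FileSystem` up from the path's own URI instead of assuming the default one.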

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
[EnvInject] - Preparing an environment for the build.
[EnvInject] - Keeping Jenkins system variables.
[EnvInject] - Keeping Jenkins build variables.
[EnvInject] - Injecting as environment variables the properties content 
PATH=/usr/local/bin:/usr/bin:/opt/apache-maven-3.3.3

[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting contributions.
Building remotely on SystemML - Jenkins Slave 01 (x86_64 slave01) in workspace <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/>
[WS-CLEANUP] Deleting project workspace...
[WS-CLEANUP] Done
Cloning the remote Git repository
Cloning repository https://github.com/apache/incubator-systemml.git
 > git init <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/> # timeout=10
Fetching upstream changes from https://github.com/apache/incubator-systemml.git
 > git --version # timeout=10
 > git -c core.askpass=true fetch --tags --progress https://github.com/apache/incubator-systemml.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/incubator-systemml.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/incubator-systemml.git # timeout=10
Fetching upstream changes from https://github.com/apache/incubator-systemml.git
 > git -c core.askpass=true fetch --tags --progress https://github.com/apache/incubator-systemml.git +refs/pull/*:refs/remotes/origin/pr/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 557aae0ff3e069295da39bb6e280f435e9514cd9 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 557aae0ff3e069295da39bb6e280f435e9514cd9
 > git rev-list 88f4a468f48081031d926d917ebc4f3e9014fc7f # timeout=10
Run condition [Always] enabling prebuild for step [[]]
[SystemML-DailyTest] $ /bin/sh -xe /tmp/hudson7457480653090904448.sh
Parsing POMs
Modules changed, recalculating dependency graph
Established TCP socket on 36239
maven32-agent.jar already up to date
maven32-interceptor.jar already up to date
maven3-interceptor-commons.jar already up to date
[SystemML-DailyTest] $ /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.51-1.b16.el7_1.x86_64/bin/java -Xmx4096m -XX:MaxPermSize=4096m -cp /a/maven32-agent.jar:/opt/apache-maven-3.3.3/boot/plexus-classworlds-2.5.2.jar:/opt/apache-maven-3.3.3/conf/logging jenkins.maven3.agent.Maven32Main /opt/apache-maven-3.3.3 /a/slave.jar /a/maven32-interceptor.jar /a/maven3-interceptor-commons.jar 36239
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=4096m; support was removed in 8.0
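The warning above is benign: Java 8 removed the permanent generation, so `-XX:MaxPermSize` is parsed but ignored. If a class-metadata cap is genuinely wanted on Java 8, the Metaspace flag is the analogue; a hedged sketch of the replacement options string (the 512m value is illustrative, not a recommendation for this build):

```shell
# Java 8 removed PermGen, so -XX:MaxPermSize is silently ignored (hence the
# warning above). MaxMetaspaceSize is the Java-8 analogue, if a cap is needed.
JVM_OPTS="-Xmx4096m -XX:MaxMetaspaceSize=512m"
echo "$JVM_OPTS"
```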
<===[JENKINS REMOTING CAPACITY]===>   channel started
Executing Maven:  -B -f <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/pom.xml> versions:display-dependency-updates -Pdistribution,rat clean package
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building SystemML 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- versions-maven-plugin:2.4:display-dependency-updates (default-cli) @ systemml ---
[INFO] artifact junit:junit: checking for updates from central
[INFO] artifact commons-logging:commons-logging: checking for updates from central
[INFO] artifact commons-collections:commons-collections: checking for updates from central
[INFO] artifact com.google.protobuf:protobuf-java: checking for updates from central
[INFO] artifact log4j:log4j: checking for updates from central
[INFO] artifact org.antlr:antlr4: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-client: checking for updates from central
[INFO] artifact org.apache.commons:commons-math3: checking for updates from central
[INFO] artifact org.antlr:antlr4-runtime: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-common: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-mapreduce-client-app: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-hdfs: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-mapreduce-client-common: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-mapreduce-client-jobclient: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-yarn-api: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-yarn-client: checking for updates from central
[INFO] artifact org.apache.hadoop:hadoop-yarn-common: checking for updates from central
[INFO] artifact org.apache.spark:spark-mllib_2.11: checking for updates from central
[INFO] artifact org.apache.wink:wink-json4j: checking for updates from central
[INFO] artifact org.codehaus.janino:janino: checking for updates from central
[INFO] artifact org.jcuda:jcublas: checking for updates from central
[INFO] artifact org.jcuda:jcublas-natives: checking for updates from central
[INFO] artifact org.jcuda:jcuda: checking for updates from central
[INFO] artifact org.jcuda:jcuda-natives: checking for updates from central
[INFO] artifact org.jcuda:jcudnn: checking for updates from central
[INFO] artifact org.jcuda:jcudnn-natives: checking for updates from central
[INFO] artifact org.jcuda:jcufft: checking for updates from central
[INFO] artifact org.jcuda:jcufft-natives: checking for updates from central
[INFO] artifact org.jcuda:jcurand: checking for updates from central
[INFO] artifact org.jcuda:jcurand-natives: checking for updates from central
[INFO] artifact org.jcuda:jcusolver: checking for updates from central
[INFO] artifact org.jcuda:jcusolver-natives: checking for updates from central
[INFO] artifact org.jcuda:jcusparse: checking for updates from central
[INFO] artifact org.jcuda:jcusparse-natives: checking for updates from central
[INFO] artifact org.jcuda:jnvgraph: checking for updates from central
[INFO] artifact org.jcuda:jnvgraph-natives: checking for updates from central
[INFO] artifact org.mockito:mockito-core: checking for updates from central
[INFO] artifact org.scala-lang:scala-library: checking for updates from central
[INFO] artifact org.scala-lang:scalap: checking for updates from central
[INFO] artifact org.scalatest:scalatest_2.11: checking for updates from central
[INFO] The following dependencies in Dependencies have newer versions:
[INFO]   com.google.protobuf:protobuf-java ..................... 3.2.0 -> 3.3.1
[INFO]   commons-collections:commons-collections ............ 3.2.1 -> 20040616
[INFO]   commons-logging:commons-logging ......................... 1.1.3 -> 1.2
[INFO]   junit:junit ............................................. 4.11 -> 4.12
[INFO]   log4j:log4j ......................................... 1.2.15 -> 1.2.17
[INFO]   org.antlr:antlr4 ........................................ 4.5.3 -> 4.7
[INFO]   org.antlr:antlr4-runtime ................................ 4.5.3 -> 4.7
[INFO]   org.apache.commons:commons-math3 ...................... 3.4.1 -> 3.6.1
[INFO]   org.apache.hadoop:hadoop-client ................ 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-common ................ 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-hdfs .................. 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-mapreduce-client-app ...
[INFO]                                                    2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-mapreduce-client-common ...
[INFO]                                                    2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-mapreduce-client-jobclient ...
[INFO]                                                    2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-yarn-api .............. 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-yarn-client ........... 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.hadoop:hadoop-yarn-common ........... 2.6.0 -> 3.0.0-alpha3
[INFO]   org.apache.spark:spark-mllib_2.11 ..................... 2.1.0 -> 2.1.1
[INFO]   org.codehaus.janino:janino ............................ 3.0.0 -> 3.0.7
[INFO]   org.mockito:mockito-core .............................. 1.9.5 -> 2.8.9
[INFO]   org.scala-lang:scala-library ..................... 2.11.8 -> 2.13.0-M1
[INFO]   org.scala-lang:scalap ............................ 2.11.8 -> 2.13.0-M1
[INFO]   org.scalatest:scalatest_2.11 .................... 2.2.6 -> 3.2.0-SNAP5
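One entry above is a known quirk rather than a real upgrade: `commons-collections 3.2.1 -> 20040616` appears because 20040616 is an old date-stamped release that sorts as "newer" under the plugin's default version comparison. The versions-maven-plugin can be told to skip such versions via a rules file referenced from the plugin's `<rulesUri>` configuration; a hedged sketch (file name and regex patterns are illustrative):

```xml
<!-- maven-version-rules.xml: ignore date-stamped and pre-release versions
     when computing "newer" dependencies (patterns are illustrative). -->
<ruleset comparisonMethod="maven"
         xmlns="http://mojo.codehaus.org/versions-maven-plugin/rule/2.0.0">
  <ignoreVersions>
    <ignoreVersion type="regex">200\d{5}</ignoreVersion>     <!-- e.g. 20040616 -->
    <ignoreVersion type="regex">.*-alpha\d*</ignoreVersion>  <!-- e.g. 3.0.0-alpha3 -->
  </ignoreVersions>
</ruleset>
```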
[INFO] 
[TASKS] Scanning folder '<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/>' for files matching the pattern '*.java,*.scala,*.xml,*.txt' - excludes: 
[TASKS] Found 1 files to scan for tasks
Found 0 open tasks.
[TASKS] Computing warning deltas based on reference build #1031
log4j:WARN No appenders could be found for logger (org.apache.commons.beanutils.converters.BooleanConverter).
log4j:WARN Please initialize the log4j system properly.
[INFO] 
[INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ systemml ---
[INFO] 
[INFO] --- maven-clean-plugin:3.0.0:clean (clean-python-files) @ systemml ---
[INFO] Deleting <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/python/systemml/mllearn> (includes = [*.pyc], excludes = [])
[INFO] Deleting <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/python/systemml> (includes = [project_info.py, *.pyc], excludes = [])
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ systemml ---
[INFO] Add Source directory: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/scala>
[INFO] Add Test Source directory: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/test/scala>
[INFO] 
[INFO] --- maven-clean-plugin:3.0.0:clean (remove-antlr-tokens-files) @ systemml ---
[INFO] Deleting <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/java> (includes = [*.tokens], excludes = [])
[INFO] 
[INFO] --- antlr4-maven-plugin:4.5.3:antlr4 (antlr) @ systemml ---
[INFO] ANTLR 4: Processing source directory <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/java>
[INFO] Processing grammar: org/apache/sysml/parser/dml/Dml.g4
[INFO] Processing grammar: org/apache/sysml/parser/pydml/Pydml.g4
[INFO] 
[INFO] --- protoc-jar-maven-plugin:3.0.0-b2.1:run (caffe-sources) @ systemml ---
[INFO] Protoc version: 2.5.0
[INFO] Input directories:
[INFO]     <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/caffe>
[INFO] Output targets:
[INFO]     java: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources> (add: main, clean: false)
[INFO] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources> does not exist. Creating...
[INFO]     Processing (java): caffe.proto
protoc-jar: protoc version: 250, detected platform: linux/amd64
protoc-jar: executing: [/tmp/protoc6290021253881745766.exe, -I<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/caffe>, --java_out=<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources>, <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/caffe/caffe.proto>]
[INFO] Adding generated classes to classpath
[INFO] 
[INFO] --- protoc-jar-maven-plugin:3.0.0-b2.1:run (tf-sources) @ systemml ---
[INFO] Protoc version: 3.0.0
[INFO] Input directories:
[INFO]     <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/tensorflow>
[INFO] Output targets:
[INFO]     java: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources> (add: main, clean: false)
[INFO]     Processing (java): event.proto
protoc-jar: protoc version: 300, detected platform: linux/amd64
protoc-jar: executing: [/tmp/protoc1814907759225159656.exe, -I<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/tensorflow>, --java_out=<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources>, <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/tensorflow/event.proto>]
[INFO]     Processing (java): summary.proto
protoc-jar: protoc version: 300, detected platform: linux/amd64
protoc-jar: executing: [/tmp/protoc1960662075467847495.exe, -I<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/tensorflow>, --java_out=<https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources>, <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/proto/tensorflow/summary.proto>]
[INFO] Adding generated classes to classpath
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.4:process (process-resource-bundles) @ systemml ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ systemml ---
[INFO] 
[INFO] --- maven-resources-plugin:2.7:resources (default-resources) @ systemml ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 121 resources to scripts
[INFO] Copying 2 resources to kernels
[INFO] Copying 2 resources to lib
[INFO] Copying 3 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ systemml ---
[WARNING]  Expected all dependencies to require Scala version: 2.11.8
[WARNING]  com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.7
[WARNING] Multiple versions of scala libraries detected!
[INFO] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/java>:-1: info: compiling
[INFO] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/scala>:-1: info: compiling
[INFO] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/generated-sources>:-1: info: compiling
[INFO] Compiling 1061 source files to <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/classes> at 1496647839607
[ERROR] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/scala/org/apache/sysml/api/dl/Utils.scala>:240: error: value validateExternalFilename is not a member of object org.apache.sysml.runtime.util.LocalFileUtils
[ERROR] 			if( !LocalFileUtils.validateExternalFilename(filePath, true) )
[ERROR]                                             ^
[ERROR] <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/src/main/scala/org/apache/sysml/api/dl/Utils.scala>:246: error: value validateExternalFilename is not a member of object org.apache.sysml.runtime.util.LocalFileUtils
[ERROR] 			if( !LocalFileUtils.validateExternalFilename(filePath, false) )
[ERROR]                                             ^
[ERROR] two errors found
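The two errors mean Utils.scala references a method that no longer exists on LocalFileUtils, i.e. the Scala caller was not updated when the internal helper was removed. As a hedged, hypothetical sketch of what such an external-filename guard might check (the method name mirrors the missing call above, but the rules and class name here are illustrative, not SystemML's implementation):

```java
// Hypothetical sketch of an external-filename guard; name mirrors the
// missing call in the errors above, rules are illustrative only.
public class FileNameCheck {
    // 'isDir' mirrors the boolean in the failing call; here it only
    // relaxes the trailing-slash rule.
    public static boolean validateExternalFilename(String filePath, boolean isDir) {
        if (filePath == null || filePath.trim().isEmpty())
            return false;                 // reject blank names
        if (filePath.contains(".."))
            return false;                 // reject path traversal
        if (!isDir && filePath.endsWith("/"))
            return false;                 // a file must not look like a directory
        return true;
    }
}
```

The follow-up hotfix for build #1033 below ("Fix scala dl-utils access to removed internals") is what actually resolved this breakage.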
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 29.922 s
[INFO] Finished at: 2017-06-05T02:30:50-05:00
[INFO] Final Memory: 71M/2618M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project systemml: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[JENKINS] Archiving <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/pom.xml> to org.apache.systemml/systemml/1.0.0-SNAPSHOT/systemml-1.0.0-SNAPSHOT.pom
channel stopped
Run condition [Always] enabling perform for step [[]]

Jenkins build is back to normal : SystemML-DailyTest #1034

Posted by je...@spark.tc.
See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/1034/changes>


Build failed in Jenkins: SystemML-DailyTest #1033

Posted by je...@spark.tc.
See <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/1033/changes>

Changes:

[Matthias Boehm] [HOTFIX][SYSTEMML-1664] Fix scala dl-utils access to removed internals

------------------------------------------
[...truncated 30190 lines...]
17/06/05 16:31:55 INFO scheduler.DAGScheduler: running: Set(ShuffleMapStage 1)
17/06/05 16:31:55 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 2)
17/06/05 16:31:55 INFO scheduler.DAGScheduler: failed: Set()
17/06/05 16:31:55 INFO scheduler.DAGScheduler: ShuffleMapStage 1 (parallelizePairs at SparkExecutionContext.java:699) finished in 0.671 s
17/06/05 16:31:55 INFO scheduler.DAGScheduler: looking for newly runnable stages
17/06/05 16:31:55 INFO scheduler.DAGScheduler: running: Set()
17/06/05 16:31:55 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 2)
17/06/05 16:31:55 INFO scheduler.DAGScheduler: failed: Set()
17/06/05 16:31:55 INFO scheduler.DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[5] at mapValues at BinarySPInstruction.java:117), which has no missing parents
17/06/05 16:31:55 INFO memory.MemoryStore: Block broadcast_2 stored as values in memory (estimated size 4.2 KB, free 1033.8 MB)
17/06/05 16:31:55 INFO memory.MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 2.3 KB, free 1033.8 MB)
17/06/05 16:31:55 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in memory on 169.54.146.43:34965 (size: 2.3 KB, free: 1033.8 MB)
17/06/05 16:31:55 INFO spark.SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:996
17/06/05 16:31:55 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[5] at mapValues at BinarySPInstruction.java:117)
17/06/05 16:31:55 INFO scheduler.TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
17/06/05 16:31:55 INFO scheduler.FairSchedulableBuilder: Added task set TaskSet_2.0 tasks to pool default
17/06/05 16:31:55 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, executor driver, partition 0, PROCESS_LOCAL, 5813 bytes)
17/06/05 16:31:55 INFO executor.Executor: Running task 0.0 in stage 2.0 (TID 2)
17/06/05 16:31:55 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
17/06/05 16:31:55 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 21 ms
17/06/05 16:31:55 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
17/06/05 16:31:55 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 1 ms
17/06/05 16:31:55 INFO executor.Executor: Finished task 0.0 in stage 2.0 (TID 2). 2077 bytes result sent to driver
17/06/05 16:31:55 INFO scheduler.DAGScheduler: ResultStage 2 (collect at SparkExecutionContext.java:789) finished in 0.145 s
17/06/05 16:31:55 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 143 ms on localhost (executor driver) (1/1)
17/06/05 16:31:55 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool default
17/06/05 16:31:56 INFO storage.BlockManagerInfo: Removed broadcast_1_piece0 on 169.54.146.43:34965 in memory (size: 1305.0 B, free: 1033.8 MB)
17/06/05 16:31:56 INFO storage.BlockManagerInfo: Removed broadcast_0_piece0 on 169.54.146.43:34965 in memory (size: 1302.0 B, free: 1033.8 MB)
17/06/05 16:31:56 INFO storage.BlockManagerInfo: Removed broadcast_2_piece0 on 169.54.146.43:34965 in memory (size: 2.3 KB, free: 1033.8 MB)
17/06/05 16:31:56 INFO spark.ContextCleaner: Cleaned shuffle 0
17/06/05 16:31:56 INFO spark.ContextCleaner: Cleaned shuffle 1
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Registering RDD 7 (parallelizePairs at SparkExecutionContext.java:699)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Registering RDD 6 (parallelizePairs at SparkExecutionContext.java:699)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Got job 1 (collect at SparkExecutionContext.java:789) with 1 output partitions
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Final stage: ResultStage 5 (collect at SparkExecutionContext.java:789)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Parents of final stage: List(ShuffleMapStage 3, ShuffleMapStage 4)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Missing parents: List(ShuffleMapStage 3, ShuffleMapStage 4)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 3 (ParallelCollectionRDD[7] at parallelizePairs at SparkExecutionContext.java:699), which has no missing parents
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_3 stored as values in memory (estimated size 2032.0 B, free 1033.8 MB)
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_3_piece0 stored as bytes in memory (estimated size 1305.0 B, free 1033.8 MB)
17/06/05 16:31:58 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in memory on 169.54.146.43:34965 (size: 1305.0 B, free: 1033.8 MB)
17/06/05 16:31:58 INFO spark.SparkContext: Created broadcast 3 from broadcast at DAGScheduler.scala:996
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 3 (ParallelCollectionRDD[7] at parallelizePairs at SparkExecutionContext.java:699)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Adding task set 3.0 with 1 tasks
17/06/05 16:31:58 INFO scheduler.FairSchedulableBuilder: Added task set TaskSet_3.0 tasks to pool default
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting ShuffleMapStage 4 (ParallelCollectionRDD[6] at parallelizePairs at SparkExecutionContext.java:699), which has no missing parents
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 3.0 (TID 3, localhost, executor driver, partition 0, PROCESS_LOCAL, 6445 bytes)
17/06/05 16:31:58 INFO executor.Executor: Running task 0.0 in stage 3.0 (TID 3)
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_4 stored as values in memory (estimated size 2032.0 B, free 1033.8 MB)
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_4_piece0 stored as bytes in memory (estimated size 1305.0 B, free 1033.8 MB)
17/06/05 16:31:58 INFO storage.BlockManagerInfo: Added broadcast_4_piece0 in memory on 169.54.146.43:34965 (size: 1305.0 B, free: 1033.8 MB)
17/06/05 16:31:58 INFO spark.SparkContext: Created broadcast 4 from broadcast at DAGScheduler.scala:996
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ShuffleMapStage 4 (ParallelCollectionRDD[6] at parallelizePairs at SparkExecutionContext.java:699)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Adding task set 4.0 with 1 tasks
17/06/05 16:31:58 INFO executor.Executor: Finished task 0.0 in stage 3.0 (TID 3). 1253 bytes result sent to driver
17/06/05 16:31:58 INFO scheduler.FairSchedulableBuilder: Added task set TaskSet_4.0 tasks to pool default
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 4.0 (TID 4, localhost, executor driver, partition 0, PROCESS_LOCAL, 6445 bytes)
17/06/05 16:31:58 INFO executor.Executor: Running task 0.0 in stage 4.0 (TID 4)
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 3.0 (TID 3) in 40 ms on localhost (executor driver) (1/1)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool default
17/06/05 16:31:58 INFO scheduler.DAGScheduler: ShuffleMapStage 3 (parallelizePairs at SparkExecutionContext.java:699) finished in 0.040 s
17/06/05 16:31:58 INFO scheduler.DAGScheduler: looking for newly runnable stages
17/06/05 16:31:58 INFO scheduler.DAGScheduler: running: Set(ShuffleMapStage 4)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 5)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: failed: Set()
17/06/05 16:31:58 INFO executor.Executor: Finished task 0.0 in stage 4.0 (TID 4). 1253 bytes result sent to driver
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 4.0 (TID 4) in 23 ms on localhost (executor driver) (1/1)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 4.0, whose tasks have all completed, from pool default
17/06/05 16:31:58 INFO scheduler.DAGScheduler: ShuffleMapStage 4 (parallelizePairs at SparkExecutionContext.java:699) finished in 0.036 s
17/06/05 16:31:58 INFO scheduler.DAGScheduler: looking for newly runnable stages
17/06/05 16:31:58 INFO scheduler.DAGScheduler: running: Set()
17/06/05 16:31:58 INFO scheduler.DAGScheduler: waiting: Set(ResultStage 5)
17/06/05 16:31:58 INFO scheduler.DAGScheduler: failed: Set()
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting ResultStage 5 (MapPartitionsRDD[11] at mapValues at BinarySPInstruction.java:117), which has no missing parents
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_5 stored as values in memory (estimated size 4.2 KB, free 1033.8 MB)
17/06/05 16:31:58 INFO memory.MemoryStore: Block broadcast_5_piece0 stored as bytes in memory (estimated size 2.3 KB, free 1033.8 MB)
17/06/05 16:31:58 INFO storage.BlockManagerInfo: Added broadcast_5_piece0 in memory on 169.54.146.43:34965 (size: 2.3 KB, free: 1033.8 MB)
17/06/05 16:31:58 INFO spark.SparkContext: Created broadcast 5 from broadcast at DAGScheduler.scala:996
17/06/05 16:31:58 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from ResultStage 5 (MapPartitionsRDD[11] at mapValues at BinarySPInstruction.java:117)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Adding task set 5.0 with 1 tasks
17/06/05 16:31:58 INFO scheduler.FairSchedulableBuilder: Added task set TaskSet_5.0 tasks to pool default
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 5.0 (TID 5, localhost, executor driver, partition 0, PROCESS_LOCAL, 5814 bytes)
17/06/05 16:31:58 INFO executor.Executor: Running task 0.0 in stage 5.0 (TID 5)
17/06/05 16:31:58 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
17/06/05 16:31:58 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
17/06/05 16:31:58 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 1 blocks
17/06/05 16:31:58 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
17/06/05 16:31:58 INFO executor.Executor: Finished task 0.0 in stage 5.0 (TID 5). 2240 bytes result sent to driver
17/06/05 16:31:58 INFO scheduler.TaskSetManager: Finished task 0.0 in stage 5.0 (TID 5) in 24 ms on localhost (executor driver) (1/1)
17/06/05 16:31:58 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 5.0, whose tasks have all completed, from pool default
17/06/05 16:31:58 INFO scheduler.DAGScheduler: ResultStage 5 (collect at SparkExecutionContext.java:789) finished in 0.027 s
17/06/05 16:32:01 INFO storage.BlockManagerInfo: Removed broadcast_4_piece0 on 169.54.146.43:34965 in memory (size: 1305.0 B, free: 1033.8 MB)
17/06/05 16:32:01 INFO storage.BlockManagerInfo: Removed broadcast_3_piece0 on 169.54.146.43:34965 in memory (size: 1305.0 B, free: 1033.8 MB)
17/06/05 16:32:01 INFO storage.BlockManagerInfo: Removed broadcast_5_piece0 on 169.54.146.43:34965 in memory (size: 2.3 KB, free: 1033.8 MB)
17/06/05 16:32:01 INFO spark.ContextCleaner: Cleaned shuffle 2
17/06/05 16:32:01 INFO spark.ContextCleaner: Cleaned shuffle 3
Running org.apache.sysml.test.integration.functions.quaternary.WeightedDivMatrixMultTest
Tests run: 90, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,132.629 sec - in org.apache.sysml.test.integration.functions.quaternary.WeightedDivMatrixMultTest
17/06/05 16:32:53 INFO util.ShutdownHookManager: Shutdown hook called
17/06/05 16:32:53 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-6f10b1bc-ef98-4043-b82d-fd827bee1cd2
Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleDenseTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 455.185 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleDenseTest
17/06/05 16:36:14 INFO server.ServerConnector: Stopped ServerConnector@546659ae{HTTP/1.1}{0.0.0.0:4040}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@564be3dd{/stages/stage/kill,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@ed0a4cd{/jobs/job/kill,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@24e65670{/api,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@d92b612{/,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@372deddf{/static,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5b0e324b{/executors/threadDump/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@7fbaccfc{/executors/threadDump,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2cc2c8e0{/executors/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4f947b7e{/executors,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4b5de9a1{/environment/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@21fe5aae{/environment,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@178e8773{/storage/rdd/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@727ec662{/storage/rdd,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@12bf1ede{/storage/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6b4ba1a5{/storage,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4ec57a04{/stages/pool/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@f7dfb90{/stages/pool,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@5b34572d{/stages/stage/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1e676990{/stages/stage,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4e31f912{/stages/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@250e2271{/stages,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@264c6b4a{/jobs/job/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@16b8d7cb{/jobs/job,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@39ca43c9{/jobs/json,null,UNAVAILABLE}
17/06/05 16:36:14 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@34a6fcfe{/jobs,null,UNAVAILABLE}
17/06/05 16:36:14 INFO ui.SparkUI: Stopped Spark web UI at http://169.54.146.43:4040
17/06/05 16:36:14 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/06/05 16:36:14 INFO memory.MemoryStore: MemoryStore cleared
17/06/05 16:36:14 INFO storage.BlockManager: BlockManager stopped
17/06/05 16:36:14 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/06/05 16:36:14 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/06/05 16:36:14 INFO spark.SparkContext: Successfully stopped SparkContext
Running org.apache.sysml.test.integration.scripts.nn.NNTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,074.691 sec - in org.apache.sysml.test.integration.scripts.nn.NNTest
17/06/05 16:36:14 INFO util.ShutdownHookManager: Shutdown hook called
17/06/05 16:36:14 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-4a3488e5-c388-4002-864d-3792ce2a8fb9
Running org.apache.sysml.test.integration.applications.parfor.ParForNaiveBayesTest
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 811.689 sec - in org.apache.sysml.test.integration.applications.parfor.ParForNaiveBayesTest
Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleSparseTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 385.795 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateUnweightedScaleSparseTest
Running org.apache.sysml.test.integration.functions.append.AppendChainTest
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,683.16 sec - in org.apache.sysml.test.integration.functions.append.AppendChainTest
Running org.apache.sysml.test.integration.applications.parfor.ParForSampleTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 180.283 sec - in org.apache.sysml.test.integration.applications.parfor.ParForSampleTest
Running org.apache.sysml.test.integration.applications.parfor.ParForUnivariateStatsTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.222 sec - in org.apache.sysml.test.integration.applications.parfor.ParForUnivariateStatsTest
Running org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleSparseTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 449.993 sec - in org.apache.sysml.test.integration.applications.descriptivestats.UnivariateWeightedScaleSparseTest
Running org.apache.sysml.test.integration.applications.dml.HITSDMLTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 178.5 sec - in org.apache.sysml.test.integration.applications.dml.HITSDMLTest
Running org.apache.sysml.test.integration.functions.append.AppendMatrixTest
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,521.625 sec - in org.apache.sysml.test.integration.functions.append.AppendMatrixTest
Running org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.009 sec - in org.apache.sysml.test.integration.applications.pydml.ArimaPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.HITSPyDMLTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.176 sec - in org.apache.sysml.test.integration.applications.pydml.HITSPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.LinearRegressionDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.446 sec - in org.apache.sysml.test.integration.applications.dml.LinearRegressionDMLTest
Running org.apache.sysml.test.integration.applications.pydml.CsplineDSPyDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.536 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineDSPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.LinearLogRegPyDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.442 sec - in org.apache.sysml.test.integration.applications.pydml.LinearLogRegPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.NaiveBayesPyDMLTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.667 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.CsplineCGDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.291 sec - in org.apache.sysml.test.integration.applications.dml.CsplineCGDMLTest
Running org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 99.62 sec - in org.apache.sysml.test.integration.applications.dml.L2SVMDMLTest
Running org.apache.sysml.test.integration.applications.pydml.MDABivariateStatsPyDMLTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 293.13 sec - in org.apache.sysml.test.integration.applications.pydml.MDABivariateStatsPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.PageRankPyDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.072 sec - in org.apache.sysml.test.integration.applications.pydml.PageRankPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.NaiveBayesParforPyDMLTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.619 sec - in org.apache.sysml.test.integration.applications.pydml.NaiveBayesParforPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.NaiveBayesDMLTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 92.045 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesDMLTest
Running org.apache.sysml.test.integration.applications.dml.MDABivariateStatsDMLTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 370.241 sec - in org.apache.sysml.test.integration.applications.dml.MDABivariateStatsDMLTest
Running org.apache.sysml.test.integration.applications.dml.PageRankDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.09 sec - in org.apache.sysml.test.integration.applications.dml.PageRankDMLTest
Running org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 581.003 sec - in org.apache.sysml.test.integration.applications.pydml.GLMPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.MultiClassSVMPyDMLTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 88.817 sec - in org.apache.sysml.test.integration.applications.pydml.MultiClassSVMPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.ID3DMLTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 698.051 sec - in org.apache.sysml.test.integration.applications.dml.ID3DMLTest
Running org.apache.sysml.test.integration.applications.dml.GLMDMLTest
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,605.525 sec - in org.apache.sysml.test.integration.applications.dml.GLMDMLTest
Running org.apache.sysml.test.integration.applications.pydml.CsplineCGPyDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.069 sec - in org.apache.sysml.test.integration.applications.pydml.CsplineCGPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.WelchTPyDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.399 sec - in org.apache.sysml.test.integration.applications.pydml.WelchTPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.MultiClassSVMDMLTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 100.782 sec - in org.apache.sysml.test.integration.applications.dml.MultiClassSVMDMLTest
Running org.apache.sysml.test.integration.applications.dml.CsplineDSDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.199 sec - in org.apache.sysml.test.integration.applications.dml.CsplineDSDMLTest
Running org.apache.sysml.test.integration.applications.pydml.ApplyTransformPyDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.8 sec - in org.apache.sysml.test.integration.applications.pydml.ApplyTransformPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.NaiveBayesParforDMLTest
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 82.24 sec - in org.apache.sysml.test.integration.applications.dml.NaiveBayesParforDMLTest
Running org.apache.sysml.test.integration.applications.pydml.L2SVMPyDMLTest
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.761 sec - in org.apache.sysml.test.integration.applications.pydml.L2SVMPyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.LinearRegressionPyDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 51.854 sec - in org.apache.sysml.test.integration.applications.pydml.LinearRegressionPyDMLTest
Running org.apache.sysml.test.integration.applications.dml.LinearLogRegDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 105.843 sec - in org.apache.sysml.test.integration.applications.dml.LinearLogRegDMLTest
Running org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 148.578 sec - in org.apache.sysml.test.integration.applications.dml.GNMFDMLTest
Running org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.583 sec - in org.apache.sysml.test.integration.applications.dml.WelchTDMLTest
Running org.apache.sysml.test.integration.applications.dml.ApplyTransformDMLTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.742 sec - in org.apache.sysml.test.integration.applications.dml.ApplyTransformDMLTest
Running org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 379.972 sec - in org.apache.sysml.test.integration.applications.pydml.ID3PyDMLTest
Running org.apache.sysml.test.integration.applications.pydml.GNMFPyDMLTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.765 sec - in org.apache.sysml.test.integration.applications.pydml.GNMFPyDMLTest

Results :

Failed tests: 
  FrameMatrixReblockTest.testFrameWriteMultipleSparseBinarySpark:170->runFrameReblockTest:230 34 values are not equal

Tests run: 7111, Failures: 1, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:verify (default) @ systemml ---
[INFO] Failsafe report directory: <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:14 h
[INFO] Finished at: 2017-06-05T17:50:39-05:00
[INFO] Final Memory: 61M/2581M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (default) on project systemml: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://sparktc.ibmcloud.com/jenkins/job/SystemML-DailyTest/ws/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
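For anyone triaging this failure locally, a quicker loop than re-running the full 3h suite is to limit failsafe to the single failing test class. This is a sketch, not the project's documented procedure: it assumes the `it.test` property supported by maven-failsafe-plugin (available in 2.17, the version used above) and that no extra build profiles are required; adjust to the project's actual setup as needed.

```shell
# Hypothetical re-run of only the failing integration test class.
# `it.test` filters failsafe to matching classes; `failIfNoTests=false`
# avoids a hard failure if other modules match nothing. Both are
# standard failsafe/surefire properties, but verify against your POM.
MVN_CMD="mvn verify -Dit.test=FrameMatrixReblockTest -DfailIfNoTests=false"
echo "$MVN_CMD"
```

Adding `-e` (as the log suggests) to that command would also surface the full stack trace for the `34 values are not equal` assertion.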
Build step 'Execute shell' marked build as failure
Run condition [Always] enabling perform for step [[]]