Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:16:17 UTC
[jira] [Resolved] (SPARK-19288) Failure (at test_sparkSQL.R#1300): date functions on a DataFrame in R/run-tests.sh
[ https://issues.apache.org/jira/browse/SPARK-19288?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-19288.
----------------------------------
Resolution: Incomplete
> Failure (at test_sparkSQL.R#1300): date functions on a DataFrame in R/run-tests.sh
> ----------------------------------------------------------------------------------
>
> Key: SPARK-19288
> URL: https://issues.apache.org/jira/browse/SPARK-19288
> Project: Spark
> Issue Type: Bug
> Components: SparkR, SQL, Tests
> Affects Versions: 2.0.1
> Environment: Ubuntu 16.04, X86_64, ppc64le
> Reporter: Nirman Narang
> Priority: Major
> Labels: bulk-closed
>
> Full log here.
> {code:title=R/run-tests.sh|borderStyle=solid}
> Loading required package: methods
> Attaching package: 'SparkR'
> The following object is masked from 'package:testthat':
> describe
> The following objects are masked from 'package:stats':
> cov, filter, lag, na.omit, predict, sd, var, window
> The following objects are masked from 'package:base':
> as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
> sample, subset, summary, transform, union
> functions on binary files : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> ....
> binary functions : ...........
> broadcast variables : ..
> functions in client.R : .....
> test functions in sparkR.R : .....Re-using existing Spark Context. Call sparkR.session.stop() or restart R to create a new Spark Context
> ...............
> include R packages : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> JVM API : ..
> MLlib functions : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> .........................SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> .Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 65,622
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 70B for [label] BINARY: 1 values, 21B raw, 23B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 87B for [terms, list, element, list, element] BINARY: 2 values, 42B raw, 43B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 30B for [hasIntercept] BOOLEAN: 1 values, 1B raw, 3B comp, 1 pages, encodings: [PLAIN, BIT_PACKED]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 49
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 90B for [labels, list, element] BINARY: 3 values, 50B raw, 50B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 92
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 61B for [vectorCol] BINARY: 1 values, 18B raw, 20B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 126B for [prefixesToRewrite, key_value, key] BINARY: 2 values, 61B raw, 61B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 58B for [prefixesToRewrite, key_value, value] BINARY: 2 values, 15B raw, 17B comp, 1 pages, encodings: [PLAIN_DICTIONARY, RLE], dic { 1 entries, 12B raw, 1B comp}
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 54
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 122B for [columnsToPrune, list, element] BINARY: 2 values, 59B raw, 59B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 56
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 51B for [intercept] DOUBLE: 1 values, 8B raw, 10B comp, 1 pages, encodings: [PLAIN, BIT_PACKED]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 45B for [coefficients, type] INT32: 1 values, 10B raw, 12B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 30B for [coefficients, size] INT32: 1 values, 7B raw, 9B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 36B for [coefficients, indices, list, element] INT32: 1 values, 13B raw, 15B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 79B for [coefficients, values, list, element] DOUBLE: 3 values, 37B raw, 38B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 65,622
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 70B for [label] BINARY: 1 values, 21B raw, 23B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 87B for [terms, list, element, list, element] BINARY: 2 values, 42B raw, 43B comp, 1 ................................................
> [Stage 419:===================================================> (194 + 4) / 200]
>
> .........................
> parallelize() and collect() : .............................
> basic RDD functions : ...........................................................................................................................................................................................................................................................................................................................................................................................................................................
> [Stage 906:====================================================> (97 + 3) / 100]
>
> .
> SerDe functionality : ...................
> partitionBy, groupByKey, reduceByKey etc. : ....................
> SparkSQL functions : .........................................................S.........................................................................................................................................................................................................
> [Stage 1148:============================================> (170 + 4) / 200]
>
> .......................................................................S....................................................................................1..................................................
> [Stage 1405:============================================> (170 + 4) / 200]
>
> ................................
> [Stage 1474:============================================> (173 + 4) / 200]
>
> ..................................................................................S..........................................................................................................................
> [Stage 1793:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1798:===================================================>(197 + 3) / 200]
>
> .
> [Stage 1800:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1802:==================================================> (194 + 4) / 200]
>
> .
> [Stage 1804:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1806:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1809: progress bar redraws elided; reached (199 + 1) / 200 tasks]
>
> .
> [Stage 1812: progress bar redraws elided; reached (197 + 3) / 200 tasks]
>
> ........................S.
> tests RDD function take() : ................
> the textFile() function : .............
> functions in utils.R : ......................................
> Windows-specific tests : S
> 1. Failure (at test_sparkSQL.R#1300): date functions on a DataFrame ------------
> collect(select(df2, minute(df2$b)))[, 1] not equal to c(34, 24)
> 2/2 mismatches (average diff: 30).
> First 2:
> pos x y diff
> 1 4 34 -30
> 2 54 24 30
> Error: Test failures
> Execution halted
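Editorial note on the failure above: a constant +/-30-minute difference on every row of the mismatch table is the signature of `minute()` being evaluated in a local timezone with a half-hour UTC offset (the build environment appears to be in IST, UTC+05:30) while the R-side expected values assume another zone. A minimal Python sketch of the arithmetic; the concrete timestamps below are illustrative, not taken from the test:

```python
from datetime import datetime, timedelta, timezone

# Illustrative timestamp chosen so its UTC minute is 34, matching one
# expected value in the failing assertion; the real test data may differ.
utc_time = datetime(2012, 12, 13, 12, 34, tzinfo=timezone.utc)

# IST is UTC+05:30; a half-hour zone offset shifts the minute field by 30.
ist = timezone(timedelta(hours=5, minutes=30))
local = utc_time.astimezone(ist)

print(utc_time.minute)  # 34 (what the test expects)
print(local.minute)     # 4  (34 + 30, mod 60 -- what the failure reports)
```

The same shift turns an expected minute of 24 into the observed 54, matching both rows of the diff table.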
> USING R_HOME =
> Using Scala 2.11
> ~/workspace/Sparkv2.0.1/spark/R ~/workspace/Sparkv2.0.1/spark/R
> USING R_HOME =
> * installing *source* package 'SparkR' ...
> ** R
> ** inst
> ** preparing package for lazy loading
> Creating a new generic function for 'as.data.frame' in package 'SparkR'
> Creating a new generic function for 'colnames' in package 'SparkR'
> Creating a new generic function for 'colnames<-' in package 'SparkR'
> Creating a new generic function for 'cov' in package 'SparkR'
> Creating a new generic function for 'drop' in package 'SparkR'
> Creating a new generic function for 'na.omit' in package 'SparkR'
> Creating a new generic function for 'filter' in package 'SparkR'
> Creating a new generic function for 'intersect' in package 'SparkR'
> Creating a new generic function for 'sample' in package 'SparkR'
> Creating a new generic function for 'transform' in package 'SparkR'
> Creating a new generic function for 'subset' in package 'SparkR'
> Creating a new generic function for 'summary' in package 'SparkR'
> Creating a new generic function for 'union' in package 'SparkR'
> Creating a new generic function for 'lag' in package 'SparkR'
> Creating a new generic function for 'rank' in package 'SparkR'
> Creating a new generic function for 'sd' in package 'SparkR'
> Creating a new generic function for 'var' in package 'SparkR'
> Creating a new generic function for 'window' in package 'SparkR'
> Creating a new generic function for 'predict' in package 'SparkR'
> Creating a new generic function for 'rbind' in package 'SparkR'
> Creating a generic function for 'alias' from package 'stats' in package 'SparkR'
> Creating a generic function for 'substr' from package 'base' in package 'SparkR'
> Creating a generic function for '%in%' from package 'base' in package 'SparkR'
> Creating a generic function for 'mean' from package 'base' in package 'SparkR'
> Creating a generic function for 'lapply' from package 'base' in package 'SparkR'
> Creating a generic function for 'Filter' from package 'base' in package 'SparkR'
> Creating a generic function for 'unique' from package 'base' in package 'SparkR'
> Creating a generic function for 'nrow' from package 'base' in package 'SparkR'
> Creating a generic function for 'ncol' from package 'base' in package 'SparkR'
> Creating a generic function for 'head' from package 'utils' in package 'SparkR'
> Creating a generic function for 'factorial' from package 'base' in package 'SparkR'
> Creating a generic function for 'atan2' from package 'base' in package 'SparkR'
> Creating a generic function for 'ifelse' from package 'base' in package 'SparkR'
> ** help
> No man pages found in package 'SparkR'
> *** installing help indices
> ** building package indices
> ** installing vignettes
> ** testing if installed package can be loaded
> * DONE (SparkR)
> Picked up _JAVA_OPTIONS: -Xms512m -Xmx1024m
> ~/workspace/Sparkv2.0.1/spark/R/pkg/html ~/workspace/Sparkv2.0.1/spark/R ~/workspace/Sparkv2.0.1/spark/R
> Loading required package: methods
> Attaching package: 'SparkR'
> The following objects are masked from 'package:stats':
> cov, filter, lag, na.omit, predict, sd, var, window
> The following objects are masked from 'package:base':
> as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
> sample, subset, summary, transform, union
> ~/workspace/Sparkv2.0.1/spark/R ~/workspace/Sparkv2.0.1/spark/R
> processing file: sparkr-vignettes.Rmd
> [knitr progress-meter redraws elided throughout the vignette build]
> inline R code fragments
> label: unnamed-chunk-1 (with options)
> List of 1
> $ message: logi FALSE
> Loading required package: methods
> Attaching package: 'SparkR'
> The following objects are masked from 'package:stats':
> cov, filter, lag, na.omit, predict, sd, var, window
> The following objects are masked from 'package:base':
> as.data.frame, colnames, colnames<-, drop, intersect, rank,
> rbind, sample, subset, summary, transform, union
> ordinary text without R code
> label: unnamed-chunk-2 (with options)
> List of 1
> $ message: logi FALSE
> Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> Picked up _JAVA_OPTIONS: -Xms512m -Xmx1024m
> Picked up _JAVA_OPTIONS: -Xms512m -Xmx1024m
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).
> 17/01/19 17:46:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 17/01/19 17:46:54 WARN Utils: Your hostname, hjsitiiti858 resolves to a loopback address: 127.0.0.1; using 10.51.237.18 instead (on interface eth0)
> 17/01/19 17:46:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
> 17/01/19 17:46:56 WARN SQLUtils: SparkR: enableHiveSupport is requested for SparkSession but Spark is not built with Hive; falling back to without Hive support.
> ordinary text without R code
> label: unnamed-chunk-3
> ordinary text without R code
> label: unnamed-chunk-4
> ordinary text without R code
> label: unnamed-chunk-5
> ordinary text without R code
> label: unnamed-chunk-6
> [Stage 6:=======================> (85 + 5) / 199]
> [Stage 6:=================================> (121 + 4) / 199]
> [Stage 6:===========================================> (156 + 4) / 199]
> [Stage 6:===================================================> (188 + 4) / 199]
>
> ordinary text without R code
> label: unnamed-chunk-7
> [Stage 8:=============================> (106 + 4) / 200]
> [Stage 8:==========================================> (153 + 5) / 200]
> [Stage 8:====================================================> (191 + 6) / 200]
>
> ordinary text without R code
> label: unnamed-chunk-8
> 17/01/19 17:47:10 WARN WeightedLeastSquares: regParam is zero, which might cause numerical instability and overfitting.
> 17/01/19 17:47:10 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
> 17/01/19 17:47:10 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
> 17/01/19 17:47:10 WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK
> 17/01/19 17:47:10 WARN LAPACK: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK
> ordinary text without R code
> label: unnamed-chunk-9
> ordinary text without R code
> label: unnamed-chunk-10 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-11 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-12 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-13 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-14 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-15 (with options)
> List of 2
> $ echo: logi FALSE
> $ tidy: logi TRUE
> ordinary text without R code
> label: unnamed-chunk-16 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-17 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-18
> ordinary text without R code
> label: unnamed-chunk-19 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-20 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-21
> ordinary text without R code
> label: unnamed-chunk-22
> ordinary text without R code
> label: unnamed-chunk-23
> ordinary text without R code
> label: unnamed-chunk-24
> ordinary text without R code
> label: unnamed-chunk-25 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-26 (with options)
> List of 1
> $ eval: logi FALSE
> ordinary text without R code
> label: unnamed-chunk-27
> ordinary text without R code
> label: unnamed-chunk-28
> ordinary text without R code
> label: unnamed-chunk-29
> ordinary text without R code
> label: unnamed-chunk-30
> ordinary text without R code
> label: unnamed-chunk-31
> [Stage 37:====================================================> (195 + 4) / 199]
>
> ordinary text without R code
> label: unnamed-chunk-32
> ordinary text without R code
> label: unnamed-chunk-33
> [Stage 42:=====================================> (139 + 4) / 199]
> [Stage 42:=====================================================>(196 + 3) / 199]
>
> ordinary text without R code
> label: unnamed-chunk-34
> ordinary text without R code
> label: unnamed-chunk-35
> ordinary text without R code
> label: unnamed-chunk-36
> 17/01/19 17:47:19 WARN Utils: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.debug.maxToStringFields' in SparkEnv.conf.
> [Stage 46: progress bar redraws elided; reached (198 + 2) / 200 tasks]
>
> ordinary text without R code
> label: unnamed-chunk-37
> [Stage 48: progress bar redraws elided; reached (193 + 4) / 200 tasks]
>
> ordinary text without R code
> label: unnamed-chunk-38
> ordinary text without R code
> label: unnamed-chunk-39
> 17/01/19 17:47:29 ERROR Executor: Exception in task 3.0 in stage 49.0 (TID 1243)
> org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 ERROR Executor: Exception in task 1.0 in stage 49.0 (TID 1241)
> org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 ERROR Executor: Exception in task 0.0 in stage 49.0 (TID 1240)
> org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 ERROR Executor: Exception in task 2.0 in stage 49.0 (TID 1242)
> org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 WARN TaskSetManager: Lost task 3.0 in stage 49.0 (TID 1243, localhost): org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 ERROR TaskSetManager: Task 3 in stage 49.0 failed 1 times; aborting job
> 17/01/19 17:47:29 ERROR RRunner: R Writer thread got an exception
> org.apache.spark.TaskKilledException
> at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
> at org.apache.spark.api.r.RRunner$$anon$2.run(RRunner.scala:158)
> 17/01/19 17:47:29 ERROR RBackendHandler: collect on 309 failed
> Quitting from lines 401-402 (sparkr-vignettes.Rmd)
> Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 49.0 failed 1 times, most recent failure: Lost task 3.0 in stage 49.0 (TID 1243, localhost): org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPool
> Calls: render ... collectRDD -> collectRDD -> .local -> callJMethod -> invokeJava
> Execution halted
> 17/01/19 17:47:29 ERROR Executor: Exception in task 4.0 in stage 49.0 (TID 1244)
> org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error in readBin(con, raw(), stringLen, endian = "big") :
> invalid 'n' argument
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> 17/01/19 17:47:29 WARN TaskSetManager: Lost task 4.0 in stage 49.0 (TID 1244, localhost): org.apache.spark.SparkException: R computation failed with
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error : requireNamespace("e1071", quietly = TRUE) is not TRUE
> Error in readBin(con, raw(), stringLen, endian = "big") :
> invalid 'n' argument
> at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
> at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:49)
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
> at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
> at org.apache.spark.scheduler.Task.run(Task.scala:86)
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
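Editorial note on the vignette failure above: the repeated `requireNamespace("e1071", quietly = TRUE) is not TRUE` errors mean the suggested R package `e1071` was simply absent on the build machine, so this is an environment problem rather than a Spark bug. The guard pattern involved (probe for an optional dependency without importing it, and fail fast with a clear message) can be sketched in Python terms; the missing-package name below is hypothetical:

```python
import importlib.util

def require_namespace(pkg: str) -> bool:
    # Rough analogue of R's requireNamespace(pkg, quietly = TRUE):
    # report whether the package is installed, without importing it.
    return importlib.util.find_spec(pkg) is not None

print(require_namespace("json"))       # True: stdlib module, always present
print(require_namespace("e1071_py"))   # False: hypothetical missing package
```

On the failing worker, the R-side equivalent of this check returned FALSE, which the vignette chunk turned into the hard `Error : ... is not TRUE` seen in each task.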
> Picked up _JAVA_OPTIONS: -Xms512m -Xmx1024m
> Picked up _JAVA_OPTIONS: -Xms512m -Xmx1024m
> Loading required package: methods
> Attaching package: 'SparkR'
> The following object is masked from 'package:testthat':
> describe
> The following objects are masked from 'package:stats':
> cov, filter, lag, na.omit, predict, sd, var, window
> The following objects are masked from 'package:base':
> as.data.frame, colnames, colnames<-, drop, intersect, rank, rbind,
> sample, subset, summary, transform, union
> functions on binary files : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> ....
> binary functions : ...........
> broadcast variables : ..
> functions in client.R : .....
> test functions in sparkR.R : .....Re-using existing Spark Context. Call sparkR.session.stop() or restart R to create a new Spark Context
> ...............
> include R packages : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> JVM API : ..
> MLlib functions : Spark package found in SPARK_HOME: /var/lib/jenkins/workspace/Sparkv2.0.1/spark
> .........................SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> .Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:53 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 65,622
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 70B for [label] BINARY: 1 values, 21B raw, 23B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 87B for [terms, list, element, list, element] BINARY: 2 values, 42B raw, 43B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:54 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 30B for [hasIntercept] BOOLEAN: 1 values, 1B raw, 3B comp, 1 pages, encodings: [PLAIN, BIT_PACKED]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 49
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 90B for [labels, list, element] BINARY: 3 values, 50B raw, 50B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 92
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 61B for [vectorCol] BINARY: 1 values, 18B raw, 20B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 126B for [prefixesToRewrite, key_value, key] BINARY: 2 values, 61B raw, 61B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 58B for [prefixesToRewrite, key_value, value] BINARY: 2 values, 15B raw, 17B comp, 1 pages, encodings: [PLAIN_DICTIONARY, RLE], dic { 1 entries, 12B raw, 1B comp}
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 54
> Jan 19, 2017 5:40:55 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 122B for [columnsToPrune, list, element] BINARY: 2 values, 59B raw, 59B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 56
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 51B for [intercept] DOUBLE: 1 values, 8B raw, 10B comp, 1 pages, encodings: [PLAIN, BIT_PACKED]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 45B for [coefficients, type] INT32: 1 values, 10B raw, 12B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 30B for [coefficients, size] INT32: 1 values, 7B raw, 9B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 36B for [coefficients, indices, list, element] INT32: 1 values, 13B raw, 15B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 79B for [coefficients, values, list, element] DOUBLE: 3 values, 37B raw, 38B comp, 1 pages, encodings: [PLAIN, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.codec.CodecConfig: Compression: SNAPPY
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet block size to 134217728
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet page size to 1048576
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Parquet dictionary page size to 1048576
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Dictionary is on
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Validation is off
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ParquetOutputFormat: Writer version is: PARQUET_1_0
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.InternalParquetRecordWriter: Flushing mem columnStore to file. allocated memory: 65,622
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 70B for [label] BINARY: 1 values, 21B raw, 23B comp, 1 pages, encodings: [PLAIN, BIT_PACKED, RLE]
> Jan 19, 2017 5:40:56 PM INFO: org.apache.parquet.hadoop.ColumnChunkPageWriteStore: written 87B for [terms, list, element, list, element] BINARY: 2 values, 42B raw, 43B comp, 1 ................................................
> [Stage 419:===================================================> (194 + 4) / 200]
>
> .........................
> parallelize() and collect() : .............................
> basic RDD functions : ...........................................................................................................................................................................................................................................................................................................................................................................................................................................
> [Stage 906:====================================================> (97 + 3) / 100]
>
> .
> SerDe functionality : ...................
> partitionBy, groupByKey, reduceByKey etc. : ....................
> SparkSQL functions : .........................................................S.........................................................................................................................................................................................................
> [Stage 1148:============================================> (170 + 4) / 200]
>
> .......................................................................S....................................................................................1..................................................
> [Stage 1405:============================================> (170 + 4) / 200]
>
> ................................
> [Stage 1474:============================================> (173 + 4) / 200]
>
> ..................................................................................S..........................................................................................................................
> [Stage 1793:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1798:===================================================>(197 + 3) / 200]
>
> .
> [Stage 1800:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1802:==================================================> (194 + 4) / 200]
>
> .
> [Stage 1804:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1806:===================================================>(198 + 2) / 200]
>
> .
> [Stage 1809:===================================================>(199 + 1) / 200]
>
> .
> [Stage 1812:===================================================>(197 + 3) / 200]
>
> ........................S.
> tests RDD function take() : ................
> the textFile() function : .............
> functions in utils.R : ......................................
> Windows-specific tests : S
> 1. Failure (at test_sparkSQL.R#1300): date functions on a DataFrame ------------
> collect(select(df2, minute(df2$b)))[, 1] not equal to c(34, 24)
> 2/2 mismatches (average diff: 30).
> First 2:
> pos  x  y diff
>   1  4 34  -30
>   2 54 24   30
> Error: Test failures
> Execution halted
> Had test failures; see logs.
> {code}
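Both mismatches in the failing assertion are exactly 30 minutes, modulo 60 (expected 34 observed 4; expected 24 observed 54), which is the signature of a half-hour UTC offset between the timezone the test assumes and the build host's local timezone. The sketch below only illustrates that arithmetic; the reference timestamp and the +30-minute offset are assumptions for illustration, since the log does not identify the host timezone.

```python
from datetime import datetime, timedelta, timezone

def minute_after_shift(minute, offset_minutes):
    """Minute-of-hour of a reference time after a timezone-style shift."""
    # Hypothetical timestamp; only the minute component matters here.
    base = datetime(2017, 1, 19, 12, minute, tzinfo=timezone.utc)
    return (base + timedelta(minutes=offset_minutes)).minute

# Expected minutes from the test vs. what a +30-minute offset would produce.
expected = [34, 24]
observed = [minute_after_shift(m, 30) for m in expected]
print(observed)  # [4, 54] -- the values in the failing output above
```

If that is the cause, rerunning the suite with the environment pinned to a whole-hour zone (e.g. `TZ=UTC`) would be a quick way to confirm.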
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org