Posted to commits@hive.apache.org by br...@apache.org on 2014/11/29 04:44:28 UTC
svn commit: r1642395 [9/22] - in /hive/branches/spark/ql/src:
java/org/apache/hadoop/hive/ql/exec/spark/
java/org/apache/hadoop/hive/ql/exec/spark/session/
test/results/clientpositive/ test/results/clientpositive/spark/
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_cond_pushdown_unqual4.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_cond_pushdown_unqual4.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_cond_pushdown_unqual4.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_cond_pushdown_unqual4.q.out Sat Nov 29 03:44:22 2014
@@ -72,20 +72,20 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: p4
- Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
+ alias: p2
+ Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Filter Operator
- predicate: p_name is not null (type: boolean)
- Statistics: Num rows: 13 Data size: 1573 Basic stats: COMPLETE Column stats: NONE
+ predicate: p2_name is not null (type: boolean)
+ Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
- 1 {p2_partkey} {p2_name} {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
+ 1 {p2_partkey} {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
2 {p3_partkey} {p3_name} {p3_mfgr} {p3_brand} {p3_type} {p3_size} {p3_container} {p3_retailprice} {p3_comment}
- 3 {p_partkey} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
+ 3 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
keys:
0 p_name (type: string)
1 p2_name (type: string)
@@ -93,7 +93,7 @@ STAGE PLANS:
3 p_name (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: p3
@@ -114,20 +114,20 @@ STAGE PLANS:
3 p_name (type: string)
Local Work:
Map Reduce Local Work
- Map 3
+ Map 4
Map Operator Tree:
TableScan
- alias: p2
- Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
+ alias: p4
+ Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: p2_name is not null (type: boolean)
- Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
+ predicate: p_name is not null (type: boolean)
+ Statistics: Num rows: 13 Data size: 1573 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
- 1 {p2_partkey} {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
+ 1 {p2_partkey} {p2_name} {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
2 {p3_partkey} {p3_name} {p3_mfgr} {p3_brand} {p3_type} {p3_size} {p3_container} {p3_retailprice} {p3_comment}
- 3 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
+ 3 {p_partkey} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
keys:
0 p_name (type: string)
1 p2_name (type: string)
@@ -140,7 +140,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: p1
@@ -165,9 +165,9 @@ STAGE PLANS:
3 p_name (type: string)
outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15, _col16, _col17, _col18, _col19, _col20, _col24, _col25, _col26, _col27, _col28, _col29, _col30, _col31, _col32, _col36, _col37, _col38, _col39, _col40, _col41, _col42, _col43, _col44
input vertices:
- 1 Map 3
- 2 Map 2
- 3 Map 1
+ 1 Map 2
+ 2 Map 3
+ 3 Map 4
Statistics: Num rows: 42 Data size: 5190 Basic stats: COMPLETE Column stats: NONE
Filter Operator
predicate: ((_col13 = _col25) and (_col1 = _col37)) (type: boolean)
@@ -216,35 +216,35 @@ STAGE PLANS:
Map 2
Map Operator Tree:
TableScan
- alias: p3
+ alias: p2
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Filter Operator
- predicate: p3_name is not null (type: boolean)
+ predicate: (p2_name is not null and p2_partkey is not null) (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col0} {_col1} {_col2} {_col3} {_col4} {_col5} {_col6} {_col7} {_col8} {_col12} {_col13} {_col14} {_col15} {_col16} {_col17} {_col18} {_col19} {_col20}
- 1 {p3_partkey} {p3_mfgr} {p3_brand} {p3_type} {p3_size} {p3_container} {p3_retailprice} {p3_comment}
+ 0 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
+ 1 {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
keys:
- 0 _col13 (type: string)
- 1 p3_name (type: string)
+ 0 p_name (type: string), p_partkey (type: int)
+ 1 p2_name (type: string), p2_partkey (type: int)
Local Work:
Map Reduce Local Work
Map 3
Map Operator Tree:
TableScan
- alias: p2
+ alias: p3
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Filter Operator
- predicate: (p2_name is not null and p2_partkey is not null) (type: boolean)
+ predicate: p3_name is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {p_partkey} {p_name} {p_mfgr} {p_brand} {p_type} {p_size} {p_container} {p_retailprice} {p_comment}
- 1 {p2_mfgr} {p2_brand} {p2_type} {p2_size} {p2_container} {p2_retailprice} {p2_comment}
+ 0 {_col0} {_col1} {_col2} {_col3} {_col4} {_col5} {_col6} {_col7} {_col8} {_col12} {_col13} {_col14} {_col15} {_col16} {_col17} {_col18} {_col19} {_col20}
+ 1 {p3_partkey} {p3_mfgr} {p3_brand} {p3_type} {p3_size} {p3_container} {p3_retailprice} {p3_comment}
keys:
- 0 p_name (type: string), p_partkey (type: int)
- 1 p2_name (type: string), p2_partkey (type: int)
+ 0 _col13 (type: string)
+ 1 p3_name (type: string)
Local Work:
Map Reduce Local Work
@@ -252,7 +252,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: p1
@@ -271,7 +271,7 @@ STAGE PLANS:
1 p2_name (type: string), p2_partkey (type: int)
outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15, _col16, _col17, _col18, _col19, _col20
input vertices:
- 1 Map 3
+ 1 Map 2
Statistics: Num rows: 7 Data size: 931 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -284,7 +284,7 @@ STAGE PLANS:
1 p3_name (type: string)
outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15, _col16, _col17, _col18, _col19, _col20, _col24, _col25, _col26, _col27, _col28, _col29, _col30, _col31, _col32
input vertices:
- 1 Map 2
+ 1 Map 3
Statistics: Num rows: 7 Data size: 1024 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
@@ -300,7 +300,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 4
Map Operator Tree:
TableScan
alias: p4
@@ -319,7 +319,7 @@ STAGE PLANS:
1 p_partkey (type: int)
outputColumnNames: _col0, _col1, _col2, _col3, _col4, _col5, _col6, _col7, _col8, _col12, _col13, _col14, _col15, _col16, _col17, _col18, _col19, _col20, _col24, _col25, _col26, _col27, _col28, _col29, _col30, _col31, _col32, _col36, _col37, _col38, _col39, _col40, _col41, _col42, _col43, _col44
input vertices:
- 0 Map 4
+ 0 Map 1
Statistics: Num rows: 14 Data size: 1730 Basic stats: COMPLETE Column stats: NONE
Filter Operator
predicate: (((_col13 = _col25) and (_col0 = _col36)) and (_col0 = _col12)) (type: boolean)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_filters_overlap.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_filters_overlap.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
Files hive/branches/spark/ql/src/test/results/clientpositive/spark/join_filters_overlap.q.out (original) and hive/branches/spark/ql/src/test/results/clientpositive/spark/join_filters_overlap.q.out Sat Nov 29 03:44:22 2014 differ
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_hive_626.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_hive_626.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_hive_626.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_hive_626.q.out Sat Nov 29 03:44:22 2014
@@ -77,35 +77,35 @@ STAGE PLANS:
Map 2
Map Operator Tree:
TableScan
- alias: hive_count
- Statistics: Num rows: 0 Data size: 5 Basic stats: PARTIAL Column stats: NONE
+ alias: hive_bar
+ Statistics: Num rows: 0 Data size: 23 Basic stats: PARTIAL Column stats: NONE
Filter Operator
- predicate: bar_id is not null (type: boolean)
+ predicate: (foo_id is not null and bar_id is not null) (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col1} {_col13}
- 1 {n}
+ 0 {foo_name}
+ 1 {bar_id} {bar_name}
keys:
- 0 _col9 (type: int)
- 1 bar_id (type: int)
+ 0 foo_id (type: int)
+ 1 foo_id (type: int)
Local Work:
Map Reduce Local Work
Map 3
Map Operator Tree:
TableScan
- alias: hive_bar
- Statistics: Num rows: 0 Data size: 23 Basic stats: PARTIAL Column stats: NONE
+ alias: hive_count
+ Statistics: Num rows: 0 Data size: 5 Basic stats: PARTIAL Column stats: NONE
Filter Operator
- predicate: (foo_id is not null and bar_id is not null) (type: boolean)
+ predicate: bar_id is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {foo_name}
- 1 {bar_id} {bar_name}
+ 0 {_col1} {_col13}
+ 1 {n}
keys:
- 0 foo_id (type: int)
- 1 foo_id (type: int)
+ 0 _col9 (type: int)
+ 1 bar_id (type: int)
Local Work:
Map Reduce Local Work
@@ -132,7 +132,7 @@ STAGE PLANS:
1 foo_id (type: int)
outputColumnNames: _col1, _col9, _col13
input vertices:
- 1 Map 3
+ 1 Map 2
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Map Join Operator
condition map:
@@ -145,7 +145,7 @@ STAGE PLANS:
1 bar_id (type: int)
outputColumnNames: _col1, _col13, _col22
input vertices:
- 1 Map 2
+ 1 Map 3
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col1 (type: string), _col13 (type: string), _col22 (type: int)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_map_ppr.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_map_ppr.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
Files hive/branches/spark/ql/src/test/results/clientpositive/spark/join_map_ppr.q.out (original) and hive/branches/spark/ql/src/test/results/clientpositive/spark/join_map_ppr.q.out Sat Nov 29 03:44:22 2014 differ
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merge_multi_expressions.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merge_multi_expressions.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merge_multi_expressions.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merge_multi_expressions.q.out Sat Nov 29 03:44:22 2014
@@ -14,7 +14,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 3
Map Operator Tree:
TableScan
alias: b
@@ -33,7 +33,7 @@ STAGE PLANS:
2 key (type: string), hr (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 4
Map Operator Tree:
TableScan
alias: c
@@ -56,10 +56,10 @@ STAGE PLANS:
Stage: Stage-1
Spark
Edges:
- Reducer 4 <- Map 3 (GROUP, 1)
+ Reducer 2 <- Map 1 (GROUP, 1)
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -80,23 +80,21 @@ STAGE PLANS:
1 key (type: string), hr (type: string)
2 key (type: string), hr (type: string)
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 3
+ 2 Map 4
Statistics: Num rows: 2200 Data size: 23372 Basic stats: COMPLETE Column stats: NONE
- Select Operator
- Statistics: Num rows: 2200 Data size: 23372 Basic stats: COMPLETE Column stats: NONE
- Group By Operator
- aggregations: count()
- mode: hash
- outputColumnNames: _col0
+ Group By Operator
+ aggregations: count()
+ mode: hash
+ outputColumnNames: _col0
+ Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
+ Reduce Output Operator
+ sort order:
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
- Reduce Output Operator
- sort order:
- Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
- value expressions: _col0 (type: bigint)
+ value expressions: _col0 (type: bigint)
Local Work:
Map Reduce Local Work
- Reducer 4
+ Reducer 2
Reduce Operator Tree:
Group By Operator
aggregations: count(VALUE._col0)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merging.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merging.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merging.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_merging.q.out Sat Nov 29 03:44:22 2014
@@ -18,23 +18,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
- Map Operator Tree:
- TableScan
- alias: p2
- Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
- Spark HashTable Sink Operator
- condition expressions:
- 0 {p_size}
- 1 {p_size}
- 2
- keys:
- 0 p_partkey (type: int)
- 1 p_partkey (type: int)
- 2 p_partkey (type: int)
- Local Work:
- Map Reduce Local Work
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: p1
@@ -53,12 +37,28 @@ STAGE PLANS:
2 p_partkey (type: int)
Local Work:
Map Reduce Local Work
+ Map 2
+ Map Operator Tree:
+ TableScan
+ alias: p2
+ Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
+ Spark HashTable Sink Operator
+ condition expressions:
+ 0 {p_size}
+ 1 {p_size}
+ 2
+ keys:
+ 0 p_partkey (type: int)
+ 1 p_partkey (type: int)
+ 2 p_partkey (type: int)
+ Local Work:
+ Map Reduce Local Work
Stage: Stage-1
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 3
Map Operator Tree:
TableScan
alias: p3
@@ -77,7 +77,7 @@ STAGE PLANS:
2 p_partkey (type: int)
outputColumnNames: _col5, _col17
input vertices:
- 0 Map 3
+ 0 Map 1
1 Map 2
Statistics: Num rows: 57 Data size: 6923 Basic stats: COMPLETE Column stats: NONE
Select Operator
@@ -120,23 +120,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
- Map Operator Tree:
- TableScan
- alias: p2
- Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
- Spark HashTable Sink Operator
- condition expressions:
- 0 {p_size}
- 1 {p_size}
- 2
- keys:
- 0 p_partkey (type: int)
- 1 p_partkey (type: int)
- 2 p_partkey (type: int)
- Local Work:
- Map Reduce Local Work
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: p1
@@ -155,12 +139,28 @@ STAGE PLANS:
2 p_partkey (type: int)
Local Work:
Map Reduce Local Work
+ Map 2
+ Map Operator Tree:
+ TableScan
+ alias: p2
+ Statistics: Num rows: 26 Data size: 3147 Basic stats: COMPLETE Column stats: NONE
+ Spark HashTable Sink Operator
+ condition expressions:
+ 0 {p_size}
+ 1 {p_size}
+ 2
+ keys:
+ 0 p_partkey (type: int)
+ 1 p_partkey (type: int)
+ 2 p_partkey (type: int)
+ Local Work:
+ Map Reduce Local Work
Stage: Stage-1
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 3
Map Operator Tree:
TableScan
alias: p3
@@ -179,7 +179,7 @@ STAGE PLANS:
2 p_partkey (type: int)
outputColumnNames: _col5, _col17
input vertices:
- 0 Map 3
+ 0 Map 1
1 Map 2
Statistics: Num rows: 57 Data size: 6923 Basic stats: COMPLETE Column stats: NONE
Filter Operator
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_nullsafe.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_nullsafe.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_nullsafe.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_nullsafe.q.out Sat Nov 29 03:44:22 2014
@@ -34,7 +34,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -53,7 +53,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -70,7 +70,7 @@ STAGE PLANS:
nullSafes: [true]
outputColumnNames: _col0, _col1, _col5, _col6
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 3 Data size: 28 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col5 (type: int), _col6 (type: int)
@@ -125,7 +125,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -144,7 +144,7 @@ STAGE PLANS:
2 key (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -168,7 +168,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -190,8 +190,8 @@ STAGE PLANS:
2 key (type: int)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 4 Data size: 37 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col5 (type: int), _col6 (type: int), _col10 (type: int), _col11 (type: int)
@@ -237,7 +237,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -253,7 +253,7 @@ STAGE PLANS:
2 key (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -274,7 +274,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -294,8 +294,8 @@ STAGE PLANS:
nullSafes: [true]
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 6 Data size: 57 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col5 (type: int), _col6 (type: int), _col10 (type: int), _col11 (type: int)
@@ -368,7 +368,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -387,7 +387,7 @@ STAGE PLANS:
2 key (type: int), value (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -411,7 +411,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -434,8 +434,8 @@ STAGE PLANS:
nullSafes: [true, false]
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 4 Data size: 37 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col5 (type: int), _col6 (type: int), _col10 (type: int), _col11 (type: int)
@@ -481,7 +481,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -497,7 +497,7 @@ STAGE PLANS:
2 key (type: int), value (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -518,7 +518,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -538,8 +538,8 @@ STAGE PLANS:
nullSafes: [true, true]
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 6 Data size: 57 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col5 (type: int), _col6 (type: int), _col10 (type: int), _col11 (type: int)
@@ -1638,7 +1638,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -1660,7 +1660,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -1680,7 +1680,7 @@ STAGE PLANS:
nullSafes: [true]
outputColumnNames: _col1, _col5
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: null (type: void), _col1 (type: int), _col5 (type: int), null (type: void)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_rc.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_rc.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_rc.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_rc.q.out Sat Nov 29 03:44:22 2014
@@ -56,7 +56,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: join_rc2
@@ -78,7 +78,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: join_rc1
@@ -97,7 +97,7 @@ STAGE PLANS:
1 key (type: string)
outputColumnNames: _col0, _col6
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 275 Data size: 2646 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col6 (type: string)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder.q.out Sat Nov 29 03:44:22 2014
@@ -66,7 +66,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -88,7 +88,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: c
@@ -107,7 +107,7 @@ STAGE PLANS:
1 (key + 1) (type: double)
outputColumnNames: _col0, _col1, _col5
input vertices:
- 0 Map 2
+ 0 Map 1
Statistics: Num rows: 275 Data size: 2921 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string)
@@ -145,7 +145,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -167,7 +167,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: c
@@ -186,7 +186,7 @@ STAGE PLANS:
1 (key + 1) (type: double)
outputColumnNames: _col0, _col1, _col5
input vertices:
- 0 Map 2
+ 0 Map 1
Statistics: Num rows: 275 Data size: 2921 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string)
@@ -261,7 +261,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -280,7 +280,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -296,7 +296,7 @@ STAGE PLANS:
1 key (type: string)
outputColumnNames: _col0, _col1, _col5
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 0 Data size: 33 Basic stats: PARTIAL Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
@@ -312,7 +312,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -328,7 +328,7 @@ STAGE PLANS:
1 val (type: string)
outputColumnNames: _col0, _col1, _col5, _col11
input vertices:
- 0 Map 3
+ 0 Map 1
Statistics: Num rows: 0 Data size: 36 Basic stats: PARTIAL Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col5 (type: string), _col1 (type: string), _col11 (type: string)
@@ -371,7 +371,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -390,7 +390,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -406,7 +406,7 @@ STAGE PLANS:
1 key (type: string)
outputColumnNames: _col0, _col1, _col5
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 0 Data size: 33 Basic stats: PARTIAL Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
@@ -422,7 +422,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -438,7 +438,7 @@ STAGE PLANS:
1 val (type: string)
outputColumnNames: _col0, _col1, _col5, _col11
input vertices:
- 0 Map 3
+ 0 Map 1
Statistics: Num rows: 0 Data size: 36 Basic stats: PARTIAL Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col5 (type: string), _col1 (type: string), _col11 (type: string)
@@ -530,7 +530,7 @@ STAGE PLANS:
Map 1
Map Operator Tree:
TableScan
- alias: b
+ alias: a
Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
@@ -540,23 +540,23 @@ STAGE PLANS:
Map 3
Map Operator Tree:
TableScan
- alias: c
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ alias: b
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
sort order: ++
Map-reduce partition columns: key (type: string), val (type: string)
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Map 4
Map Operator Tree:
TableScan
- alias: a
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ alias: c
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
sort order: ++
Map-reduce partition columns: key (type: string), val (type: string)
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Reducer 2
Reduce Operator Tree:
Join Operator
@@ -614,7 +614,7 @@ STAGE PLANS:
Map 1
Map Operator Tree:
TableScan
- alias: b
+ alias: a
Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
@@ -624,23 +624,23 @@ STAGE PLANS:
Map 3
Map Operator Tree:
TableScan
- alias: c
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ alias: b
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
sort order: ++
Map-reduce partition columns: key (type: string), val (type: string)
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Map 4
Map Operator Tree:
TableScan
- alias: a
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ alias: c
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Reduce Output Operator
key expressions: key (type: string), val (type: string)
sort order: ++
Map-reduce partition columns: key (type: string), val (type: string)
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Reducer 2
Reduce Operator Tree:
Join Operator
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder2.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder2.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder2.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder2.q.out Sat Nov 29 03:44:22 2014
@@ -84,10 +84,10 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: d
+ alias: b
Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
@@ -95,9 +95,9 @@ STAGE PLANS:
Spark HashTable Sink Operator
condition expressions:
0 {key} {val}
- 1 {key} {val}
+ 1 {val}
2 {key} {val}
- 3 {val}
+ 3 {key} {val}
keys:
0 key (type: string)
1 key (type: string)
@@ -105,19 +105,19 @@ STAGE PLANS:
3 key (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
- alias: b
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ alias: c
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0 {key} {val}
- 1 {val}
- 2 {key} {val}
+ 1 {key} {val}
+ 2 {val}
3 {key} {val}
keys:
0 key (type: string)
@@ -126,11 +126,11 @@ STAGE PLANS:
3 key (type: string)
Local Work:
Map Reduce Local Work
- Map 3
+ Map 4
Map Operator Tree:
TableScan
- alias: c
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ alias: d
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
@@ -138,8 +138,8 @@ STAGE PLANS:
condition expressions:
0 {key} {val}
1 {key} {val}
- 2 {val}
- 3 {key} {val}
+ 2 {key} {val}
+ 3 {val}
keys:
0 key (type: string)
1 key (type: string)
@@ -152,7 +152,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -179,7 +179,7 @@ STAGE PLANS:
input vertices:
1 Map 2
2 Map 3
- 3 Map 1
+ 3 Map 4
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string), _col15 (type: string), _col16 (type: string)
@@ -244,23 +244,6 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
- Map Operator Tree:
- TableScan
- alias: d
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
- Filter Operator
- predicate: (key + 1) is not null (type: boolean)
- Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
- Spark HashTable Sink Operator
- condition expressions:
- 0 {_col0} {_col1} {_col5} {_col6} {_col10} {_col11}
- 1 {key} {val}
- keys:
- 0 (_col0 + 1) (type: double)
- 1 (key + 1) (type: double)
- Local Work:
- Map Reduce Local Work
Map 2
Map Operator Tree:
TableScan
@@ -295,12 +278,29 @@ STAGE PLANS:
1 val (type: string)
Local Work:
Map Reduce Local Work
+ Map 4
+ Map Operator Tree:
+ TableScan
+ alias: d
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ Filter Operator
+ predicate: (key + 1) is not null (type: boolean)
+ Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
+ Spark HashTable Sink Operator
+ condition expressions:
+ 0 {_col0} {_col1} {_col5} {_col6} {_col10} {_col11}
+ 1 {key} {val}
+ keys:
+ 0 (_col0 + 1) (type: double)
+ 1 (key + 1) (type: double)
+ Local Work:
+ Map Reduce Local Work
Stage: Stage-1
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -345,7 +345,7 @@ STAGE PLANS:
1 (key + 1) (type: double)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11, _col15, _col16
input vertices:
- 1 Map 1
+ 1 Map 4
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string), _col15 (type: string), _col16 (type: string)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder3.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder3.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder3.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder3.q.out Sat Nov 29 03:44:22 2014
@@ -84,10 +84,10 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: d
+ alias: b
Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
@@ -95,9 +95,9 @@ STAGE PLANS:
Spark HashTable Sink Operator
condition expressions:
0 {key} {val}
- 1 {key} {val}
+ 1 {val}
2 {key} {val}
- 3 {val}
+ 3 {key} {val}
keys:
0 key (type: string)
1 key (type: string)
@@ -105,19 +105,19 @@ STAGE PLANS:
3 key (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
- alias: b
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ alias: c
+ Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0 {key} {val}
- 1 {val}
- 2 {key} {val}
+ 1 {key} {val}
+ 2 {val}
3 {key} {val}
keys:
0 key (type: string)
@@ -126,11 +126,11 @@ STAGE PLANS:
3 key (type: string)
Local Work:
Map Reduce Local Work
- Map 3
+ Map 4
Map Operator Tree:
TableScan
- alias: c
- Statistics: Num rows: 0 Data size: 20 Basic stats: PARTIAL Column stats: NONE
+ alias: d
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
Filter Operator
predicate: key is not null (type: boolean)
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
@@ -138,8 +138,8 @@ STAGE PLANS:
condition expressions:
0 {key} {val}
1 {key} {val}
- 2 {val}
- 3 {key} {val}
+ 2 {key} {val}
+ 3 {val}
keys:
0 key (type: string)
1 key (type: string)
@@ -152,7 +152,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -179,7 +179,7 @@ STAGE PLANS:
input vertices:
1 Map 2
2 Map 3
- 3 Map 1
+ 3 Map 4
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string), _col15 (type: string), _col16 (type: string)
@@ -244,23 +244,6 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
- Map Operator Tree:
- TableScan
- alias: d
- Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
- Filter Operator
- predicate: (key + 1) is not null (type: boolean)
- Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
- Spark HashTable Sink Operator
- condition expressions:
- 0 {_col0} {_col1} {_col5} {_col6} {_col10} {_col11}
- 1 {key} {val}
- keys:
- 0 (_col0 + 1) (type: double)
- 1 (key + 1) (type: double)
- Local Work:
- Map Reduce Local Work
Map 2
Map Operator Tree:
TableScan
@@ -295,12 +278,29 @@ STAGE PLANS:
1 val (type: string)
Local Work:
Map Reduce Local Work
+ Map 4
+ Map Operator Tree:
+ TableScan
+ alias: d
+ Statistics: Num rows: 0 Data size: 30 Basic stats: PARTIAL Column stats: NONE
+ Filter Operator
+ predicate: (key + 1) is not null (type: boolean)
+ Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
+ Spark HashTable Sink Operator
+ condition expressions:
+ 0 {_col0} {_col1} {_col5} {_col6} {_col10} {_col11}
+ 1 {key} {val}
+ keys:
+ 0 (_col0 + 1) (type: double)
+ 1 (key + 1) (type: double)
+ Local Work:
+ Map Reduce Local Work
Stage: Stage-1
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -345,7 +345,7 @@ STAGE PLANS:
1 (key + 1) (type: double)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11, _col15, _col16
input vertices:
- 1 Map 1
+ 1 Map 4
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string), _col15 (type: string), _col16 (type: string)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder4.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder4.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder4.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_reorder4.q.out Sat Nov 29 03:44:22 2014
@@ -60,7 +60,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -79,7 +79,7 @@ STAGE PLANS:
2 key3 (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -103,7 +103,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -125,8 +125,8 @@ STAGE PLANS:
2 key3 (type: string)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string)
@@ -175,7 +175,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -194,7 +194,7 @@ STAGE PLANS:
2 key3 (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -218,7 +218,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -240,8 +240,8 @@ STAGE PLANS:
2 key3 (type: string)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string)
@@ -290,7 +290,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: b
@@ -309,7 +309,7 @@ STAGE PLANS:
2 key3 (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: c
@@ -333,7 +333,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: a
@@ -355,8 +355,8 @@ STAGE PLANS:
2 key3 (type: string)
outputColumnNames: _col0, _col1, _col5, _col6, _col10, _col11
input vertices:
- 1 Map 1
- 2 Map 2
+ 1 Map 2
+ 2 Map 3
Statistics: Num rows: 0 Data size: 0 Basic stats: NONE Column stats: NONE
Select Operator
expressions: _col0 (type: string), _col1 (type: string), _col5 (type: string), _col6 (type: string), _col10 (type: string), _col11 (type: string)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_star.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_star.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_star.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_star.q.out Sat Nov 29 03:44:22 2014
@@ -140,7 +140,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: dim1
@@ -162,7 +162,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: fact
@@ -181,7 +181,7 @@ STAGE PLANS:
1 f1 (type: int)
outputColumnNames: _col0, _col1, _col8
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 4 Data size: 53 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col8 (type: int)
@@ -237,38 +237,38 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: dim2
+ alias: dim1
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: f3 is not null (type: boolean)
+ predicate: f1 is not null (type: boolean)
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col0} {_col1} {_col8}
- 1 {f4}
+ 0 {m1} {m2} {d2}
+ 1 {f2}
keys:
- 0 _col3 (type: int)
- 1 f3 (type: int)
+ 0 d1 (type: int)
+ 1 f1 (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
- alias: dim1
+ alias: dim2
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: f1 is not null (type: boolean)
+ predicate: f3 is not null (type: boolean)
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {m1} {m2} {d2}
- 1 {f2}
+ 0 {_col0} {_col1} {_col8}
+ 1 {f4}
keys:
- 0 d1 (type: int)
- 1 f1 (type: int)
+ 0 _col3 (type: int)
+ 1 f3 (type: int)
Local Work:
Map Reduce Local Work
@@ -276,7 +276,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: fact
@@ -308,7 +308,7 @@ STAGE PLANS:
1 f3 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13
input vertices:
- 1 Map 1
+ 1 Map 3
Statistics: Num rows: 2 Data size: 38 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col8 (type: int), _col13 (type: int)
@@ -366,38 +366,38 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: dim2
+ alias: dim1
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: f3 is not null (type: boolean)
+ predicate: (f1 is not null and f2 is not null) (type: boolean)
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col0} {_col1} {_col8}
- 1 {f4}
+ 0 {m1} {m2}
+ 1 {f2}
keys:
- 0 _col8 (type: int)
- 1 f3 (type: int)
+ 0 d1 (type: int)
+ 1 f1 (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
- alias: dim1
+ alias: dim2
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: (f1 is not null and f2 is not null) (type: boolean)
+ predicate: f3 is not null (type: boolean)
Statistics: Num rows: 1 Data size: 8 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {m1} {m2}
- 1 {f2}
+ 0 {_col0} {_col1} {_col8}
+ 1 {f4}
keys:
- 0 d1 (type: int)
- 1 f1 (type: int)
+ 0 _col8 (type: int)
+ 1 f3 (type: int)
Local Work:
Map Reduce Local Work
@@ -405,7 +405,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: fact
@@ -437,7 +437,7 @@ STAGE PLANS:
1 f3 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13
input vertices:
- 1 Map 1
+ 1 Map 3
Statistics: Num rows: 4 Data size: 58 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col8 (type: int), _col13 (type: int)
@@ -495,32 +495,32 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: dim2
+ alias: dim1
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col0} {_col1} {_col8}
- 1 {f4}
+ 0 {m1} {m2}
+ 1 {f2}
keys:
- 0 _col8 (type: int)
- 1 f3 (type: int)
+ 0 d1 (type: int)
+ 1 f1 (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
- alias: dim1
+ alias: dim2
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {m1} {m2}
- 1 {f2}
+ 0 {_col0} {_col1} {_col8}
+ 1 {f4}
keys:
- 0 d1 (type: int)
- 1 f1 (type: int)
+ 0 _col8 (type: int)
+ 1 f3 (type: int)
Local Work:
Map Reduce Local Work
@@ -528,7 +528,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: fact
@@ -557,7 +557,7 @@ STAGE PLANS:
1 f3 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13
input vertices:
- 1 Map 1
+ 1 Map 3
Statistics: Num rows: 8 Data size: 117 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col8 (type: int), _col13 (type: int)
@@ -631,21 +631,21 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
- alias: dim3
+ alias: dim1
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {_col0} {_col1} {_col8} {_col13}
- 1 {f6}
+ 0 {m1} {m2} {d2}
+ 1 {f2}
keys:
- 0 _col3 (type: int)
- 1 f5 (type: int)
+ 0 d1 (type: int)
+ 1 f1 (type: int)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 3
Map Operator Tree:
TableScan
alias: dim2
@@ -659,38 +659,24 @@ STAGE PLANS:
1 f3 (type: int)
Local Work:
Map Reduce Local Work
- Map 3
+ Map 4
Map Operator Tree:
TableScan
- alias: dim1
+ alias: dim3
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
- 0 {m1} {m2} {d2}
- 1 {f2}
+ 0 {_col0} {_col1} {_col8} {_col13}
+ 1 {f6}
keys:
- 0 d1 (type: int)
- 1 f1 (type: int)
+ 0 _col3 (type: int)
+ 1 f5 (type: int)
Local Work:
Map Reduce Local Work
Map 5
Map Operator Tree:
TableScan
- alias: dim7
- Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
- Spark HashTable Sink Operator
- condition expressions:
- 0 {_col0} {_col1} {_col8} {_col13} {_col18} {_col23} {_col28} {_col33}
- 1 {f14}
- keys:
- 0 _col28 (type: int)
- 1 f13 (type: int)
- Local Work:
- Map Reduce Local Work
- Map 6
- Map Operator Tree:
- TableScan
- alias: dim6
+ alias: dim4
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
@@ -703,7 +689,7 @@ STAGE PLANS:
2 f11 (type: int)
Local Work:
Map Reduce Local Work
- Map 7
+ Map 6
Map Operator Tree:
TableScan
alias: dim5
@@ -717,10 +703,10 @@ STAGE PLANS:
1 f9 (type: int)
Local Work:
Map Reduce Local Work
- Map 8
+ Map 7
Map Operator Tree:
TableScan
- alias: dim4
+ alias: dim6
Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
@@ -733,12 +719,26 @@ STAGE PLANS:
2 f11 (type: int)
Local Work:
Map Reduce Local Work
+ Map 8
+ Map Operator Tree:
+ TableScan
+ alias: dim7
+ Statistics: Num rows: 2 Data size: 16 Basic stats: COMPLETE Column stats: NONE
+ Spark HashTable Sink Operator
+ condition expressions:
+ 0 {_col0} {_col1} {_col8} {_col13} {_col18} {_col23} {_col28} {_col33}
+ 1 {f14}
+ keys:
+ 0 _col28 (type: int)
+ 1 f13 (type: int)
+ Local Work:
+ Map Reduce Local Work
Stage: Stage-1
Spark
#### A masked pattern was here ####
Vertices:
- Map 4
+ Map 1
Map Operator Tree:
TableScan
alias: fact
@@ -754,7 +754,7 @@ STAGE PLANS:
1 f1 (type: int)
outputColumnNames: _col0, _col1, _col3, _col8
input vertices:
- 1 Map 3
+ 1 Map 2
Statistics: Num rows: 6 Data size: 107 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -767,7 +767,7 @@ STAGE PLANS:
1 f3 (type: int)
outputColumnNames: _col0, _col1, _col3, _col8, _col13
input vertices:
- 1 Map 2
+ 1 Map 3
Statistics: Num rows: 6 Data size: 117 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -780,7 +780,7 @@ STAGE PLANS:
1 f5 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13, _col18
input vertices:
- 1 Map 1
+ 1 Map 4
Statistics: Num rows: 6 Data size: 128 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -796,8 +796,8 @@ STAGE PLANS:
2 f11 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13, _col18, _col23, _col28
input vertices:
- 1 Map 8
- 2 Map 6
+ 1 Map 5
+ 2 Map 7
Statistics: Num rows: 13 Data size: 281 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -810,7 +810,7 @@ STAGE PLANS:
1 f9 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13, _col18, _col23, _col28, _col33
input vertices:
- 1 Map 7
+ 1 Map 6
Statistics: Num rows: 14 Data size: 309 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -823,7 +823,7 @@ STAGE PLANS:
1 f13 (type: int)
outputColumnNames: _col0, _col1, _col8, _col13, _col18, _col23, _col28, _col33, _col38
input vertices:
- 1 Map 5
+ 1 Map 8
Statistics: Num rows: 15 Data size: 339 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col1 (type: int), _col8 (type: int), _col13 (type: int), _col18 (type: int), _col23 (type: int), _col33 (type: int), _col28 (type: int), _col38 (type: int)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_thrift.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_thrift.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_thrift.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_thrift.q.out Sat Nov 29 03:44:22 2014
@@ -40,7 +40,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: s2
@@ -62,7 +62,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: s1
@@ -81,7 +81,7 @@ STAGE PLANS:
1 aint (type: int)
outputColumnNames: _col0, _col17
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 6 Data size: 1841 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col0 (type: int), _col17 (type: array<struct<myint:int,mystring:string,underscore_int:int>>)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_vc.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_vc.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_vc.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_vc.q.out Sat Nov 29 03:44:22 2014
@@ -20,48 +20,48 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 3
Map Operator Tree:
TableScan
- alias: t3
+ alias: t2
Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: value is not null (type: boolean)
- Statistics: Num rows: 250 Data size: 2656 Basic stats: COMPLETE Column stats: NONE
+ predicate: (key is not null and value is not null) (type: boolean)
+ Statistics: Num rows: 125 Data size: 1328 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0
- 1 {key} {BLOCK__OFFSET__INSIDE__FILE}
+ 1 {value}
keys:
- 0 _col6 (type: string)
- 1 value (type: string)
+ 0 key (type: string)
+ 1 key (type: string)
Local Work:
Map Reduce Local Work
- Map 2
+ Map 4
Map Operator Tree:
TableScan
- alias: t2
+ alias: t3
Statistics: Num rows: 500 Data size: 5312 Basic stats: COMPLETE Column stats: NONE
Filter Operator
- predicate: (key is not null and value is not null) (type: boolean)
- Statistics: Num rows: 125 Data size: 1328 Basic stats: COMPLETE Column stats: NONE
+ predicate: value is not null (type: boolean)
+ Statistics: Num rows: 250 Data size: 2656 Basic stats: COMPLETE Column stats: NONE
Spark HashTable Sink Operator
condition expressions:
0
- 1 {value}
+ 1 {key} {BLOCK__OFFSET__INSIDE__FILE}
keys:
- 0 key (type: string)
- 1 key (type: string)
+ 0 _col6 (type: string)
+ 1 value (type: string)
Local Work:
Map Reduce Local Work
Stage: Stage-1
Spark
Edges:
- Reducer 4 <- Map 3 (SORT, 1)
+ Reducer 2 <- Map 1 (SORT, 1)
#### A masked pattern was here ####
Vertices:
- Map 3
+ Map 1
Map Operator Tree:
TableScan
alias: t1
@@ -80,7 +80,7 @@ STAGE PLANS:
1 key (type: string)
outputColumnNames: _col6
input vertices:
- 1 Map 2
+ 1 Map 3
Statistics: Num rows: 275 Data size: 2921 Basic stats: COMPLETE Column stats: NONE
Map Join Operator
condition map:
@@ -93,7 +93,7 @@ STAGE PLANS:
1 value (type: string)
outputColumnNames: _col10, _col11, _col12
input vertices:
- 1 Map 1
+ 1 Map 4
Statistics: Num rows: 302 Data size: 3213 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col12 (type: bigint), _col10 (type: string), _col11 (type: string)
@@ -105,7 +105,7 @@ STAGE PLANS:
Statistics: Num rows: 302 Data size: 3213 Basic stats: COMPLETE Column stats: NONE
Local Work:
Map Reduce Local Work
- Reducer 4
+ Reducer 2
Reduce Operator Tree:
Select Operator
expressions: KEY.reducesinkkey0 (type: bigint), KEY.reducesinkkey1 (type: string), KEY.reducesinkkey2 (type: string)
@@ -157,7 +157,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Map Operator Tree:
TableScan
alias: t2
@@ -179,7 +179,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Map Operator Tree:
TableScan
alias: t1
@@ -198,7 +198,7 @@ STAGE PLANS:
1 key (type: string)
outputColumnNames: _col7
input vertices:
- 1 Map 1
+ 1 Map 2
Statistics: Num rows: 182 Data size: 1939 Basic stats: COMPLETE Column stats: NONE
Select Operator
expressions: _col7 (type: bigint)
Modified: hive/branches/spark/ql/src/test/results/clientpositive/spark/join_view.q.out
URL: http://svn.apache.org/viewvc/hive/branches/spark/ql/src/test/results/clientpositive/spark/join_view.q.out?rev=1642395&r1=1642394&r2=1642395&view=diff
==============================================================================
--- hive/branches/spark/ql/src/test/results/clientpositive/spark/join_view.q.out (original)
+++ hive/branches/spark/ql/src/test/results/clientpositive/spark/join_view.q.out Sat Nov 29 03:44:22 2014
@@ -52,7 +52,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 1
+ Map 2
Local Work:
Map Reduce Local Work
@@ -60,7 +60,7 @@ STAGE PLANS:
Spark
#### A masked pattern was here ####
Vertices:
- Map 2
+ Map 1
Local Work:
Map Reduce Local Work
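Across every hunk above the pattern is the same: the vertex that scans the probe-side (big) table is renumbered to Map 1, and the Spark HashTable Sink vertices take the following numbers in dependency order (e.g. aliases a/b/c/d become Map 1/2/3/4). A minimal sketch of that kind of deterministic numbering over a plan DAG — hypothetical code, not Hive's actual implementation — looks like this:

```python
from collections import deque

def number_vertices(root, children):
    """Assign "Map N" labels in breadth-first order from the root
    (big-table) vertex; `children` maps a vertex to the vertices
    whose hash tables it consumes. Purely illustrative of the
    renumbering visible in the diffs above."""
    labels, queue, n = {}, deque([root]), 0
    while queue:
        v = queue.popleft()
        if v in labels:          # already numbered via another edge
            continue
        n += 1
        labels[v] = f"Map {n}"
        queue.extend(children.get(v, []))
    return labels

# 'a' scans the big table; b, c, d feed hash tables into its map joins.
print(number_vertices("a", {"a": ["b", "c", "d"]}))
# → {'a': 'Map 1', 'b': 'Map 2', 'c': 'Map 3', 'd': 'Map 4'}
```

Numbering from the root rather than in parse order is what makes the golden files stable: the same query always yields the same vertex labels regardless of the order in which the optimizer created the local-work stages.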