Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/08/14 16:34:20 UTC
[jira] [Commented] (SPARK-17052) Remove Duplicate Test Cases auto_join from HiveCompatibilitySuite.scala
[ https://issues.apache.org/jira/browse/SPARK-17052?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420385#comment-15420385 ]
Apache Spark commented on SPARK-17052:
--------------------------------------
User 'gatorsmile' has created a pull request for this issue:
https://github.com/apache/spark/pull/14635
> Remove Duplicate Test Cases auto_join from HiveCompatibilitySuite.scala
> -----------------------------------------------------------------------
>
> Key: SPARK-17052
> URL: https://issues.apache.org/jira/browse/SPARK-17052
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Xiao Li
> Priority: Minor
>
> The original JIRA [HIVE-1642](https://issues.apache.org/jira/browse/HIVE-1642) added the test cases `auto_joinXYZ` to verify the results when joins are automatically converted to map joins. Most of them are simply copied from the corresponding `joinXYZ` test cases.
> After comparing `auto_joinXYZ` with `joinXYZ`, the following test cases are duplicates:
> {noformat}
> "auto_join0",
> "auto_join1",
> "auto_join10",
> "auto_join11",
> "auto_join12",
> "auto_join13",
> "auto_join14",
> "auto_join14_hadoop20",
> "auto_join15",
> "auto_join17",
> "auto_join18",
> "auto_join2",
> "auto_join20",
> "auto_join21",
> "auto_join23",
> "auto_join24",
> "auto_join3",
> "auto_join4",
> "auto_join5",
> "auto_join6",
> "auto_join7",
> "auto_join8",
> "auto_join9"
> {noformat}
> We can remove all of them without affecting the test coverage.
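One way to double-check that an `auto_joinX` case really duplicates its `joinX` counterpart is to compare the query text of the two golden files, ignoring formatting and the map-join configuration lines that the `auto_join` variants add. The sketch below is illustrative only; the helper names and the inline queries are not from the Spark or Hive repositories:

```python
def normalize(sql: str) -> str:
    """Lowercase and collapse whitespace so that formatting-only
    differences between query files are ignored."""
    lines = (" ".join(line.split()).lower() for line in sql.splitlines())
    return "\n".join(line for line in lines if line)

def is_duplicate(auto_q: str, plain_q: str) -> bool:
    """True when the auto_join query matches the plain join query,
    modulo `set hive.auto.convert.join` lines (hypothetical filter)."""
    def strip_settings(sql: str) -> str:
        return "\n".join(
            line for line in normalize(sql).splitlines()
            if not line.startswith("set hive.auto.convert.join")
        )
    return strip_settings(auto_q) == strip_settings(plain_q)

# Illustrative usage with inline query text instead of real .q files:
plain = ("SELECT src1.key, src2.value\n"
         "FROM src src1 JOIN src src2 ON src1.key = src2.key")
auto = "set hive.auto.convert.join = true\n" + plain
print(is_duplicate(auto, plain))  # True
```

Running such a comparison over each `auto_joinXYZ`/`joinXYZ` pair listed above would confirm that dropping the `auto_join` copies loses no distinct query coverage.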
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org