Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/09/16 01:50:45 UTC
[jira] [Commented] (SPARK-10508) incorrect evaluation of searched case expression
[ https://issues.apache.org/jira/browse/SPARK-10508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14746515#comment-14746515 ]
Josh Rosen commented on SPARK-10508:
------------------------------------
I managed to reproduce this on 1.3.1 as well, using the following code:
{code}
val df = Seq(
  (0, null.asInstanceOf[Double]),
  (1, -1.0),
  (2, 0.0),
  (3, 1.0),
  (4, 0.1),
  (5, 10.0)
).toDF("rnum", "cdec").selectExpr("rnum", "cast(cdec as decimal(7, 2)) as cdec")
df.registerTempTable("TDEC")
sqlContext.sql("select rnum, cdec, case when cdec in ( -1,10,0.1 ) then 'test1' else 'other' end from tdec")
{code}
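A couple of notes for anyone reproducing this in a 1.3.x spark-shell: null.asInstanceOf[Double] unboxes to 0.0 in Scala, so the first row actually holds 0 rather than NULL (which is why it prints as 0 below), and the final sqlContext.sql(...) call only builds a DataFrame without printing anything. A minimal way to materialize the rows, assuming the shell's built-in sqlContext, is:
{code}
// Sketch, assuming the spark-shell's built-in sqlContext: run the query and
// print its rows. On affected 1.3.x builds every row shows 'other'.
val result = sqlContext.sql(
  "select rnum, cdec, case when cdec in ( -1,10,0.1 ) then 'test1' else 'other' end from tdec")
result.show()
{code}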
I was able to confirm that this is fixed in at least 1.4.1 and 1.5.0, where the same code gives the following result:
{code}
0 0 other
1 -1 test1
2 0 other
3 1 other
4 0.1 test1
5 10 test1
{code}
You should be able to work around this by upgrading to a newer Spark version, so I'm going to mark this as fixed in 1.4.1 / 1.5.0.
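If upgrading isn't an option right away, one possible workaround (a sketch I have not verified against 1.3.1) is to avoid the IN list and compare against literals cast explicitly to the column's decimal type:
{code}
// Untested workaround sketch for 1.3.x: expand the IN list into equality
// comparisons against literals cast to the column's decimal(7,2) type, so
// the result no longer depends on IN's type coercion.
sqlContext.sql("""
  select rnum, cdec,
         case when cdec = cast(-1  as decimal(7,2))
                or cdec = cast(10  as decimal(7,2))
                or cdec = cast(0.1 as decimal(7,2))
              then 'test1' else 'other' end
  from tdec
""").show()
{code}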
> incorrect evaluation of searched case expression
> ------------------------------------------------
>
> Key: SPARK-10508
> URL: https://issues.apache.org/jira/browse/SPARK-10508
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.3.1
> Reporter: N Campbell
> Fix For: 1.4.1, 1.5.0
>
>
> The following case expression never evaluates to 'test1' when cdec is -1 or 10, as it does in Hive 0.13. Instead it returns 'other' for all rows.
> {code}
> select rnum, cdec, case when cdec in ( -1,10,0.1 ) then 'test1' else 'other' end from tdec
> create table if not exists TDEC ( RNUM int , CDEC decimal(7, 2) )
> ROW FORMAT DELIMITED FIELDS TERMINATED BY '|' LINES TERMINATED BY '\n'
> STORED AS orc ;
> 0|\N
> 1|-1.00
> 2|0.00
> 3|1.00
> 4|0.10
> 5|10.00
> {code}