Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/09/10 20:08:00 UTC

[jira] [Assigned] (SPARK-25398) Minor bugs from comparing unrelated types

     [ https://issues.apache.org/jira/browse/SPARK-25398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-25398:
------------------------------------

    Assignee: Sean Owen  (was: Apache Spark)

> Minor bugs from comparing unrelated types
> -----------------------------------------
>
>                 Key: SPARK-25398
>                 URL: https://issues.apache.org/jira/browse/SPARK-25398
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos, Spark Core, YARN
>    Affects Versions: 2.3.1
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Minor
>
> I noticed a potential issue flagged by Scala inspections, such as this clause in LiveEntity.scala around line 586:
> {code:java}
>  (!acc.metadata.isDefined ||
>   acc.metadata.get != Some(AccumulatorContext.SQL_ACCUM_IDENTIFIER)){code}
> The issue is that acc.metadata is an Option[String], so acc.metadata.get is a String and can never equal a Some[String], which makes the whole condition always true. This was just meant to be:
> {code:java}
>  acc.metadata != Some(AccumulatorContext.SQL_ACCUM_IDENTIFIER){code}
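> To see the mismatch in isolation, here is a minimal standalone sketch (plain Scala, not the Spark code itself; "sql" stands in for AccumulatorContext.SQL_ACCUM_IDENTIFIER):
> {code:java}
> val metadata: Option[String] = Some("sql")
> // metadata.get is a String; a String is never equal to a Some[String],
> // so the original comparison holds no matter what the value is:
> metadata.get != Some("sql")  // always true when metadata is defined
> // Comparing Option to Option does what was intended:
> metadata != Some("sql")      // false here; true only for other values{code}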
> This may or may not actually cause a bug, but it seems worth fixing. There are a number of other instances like this, mostly in tests, that might likewise mask real assertion problems.
> Many of them, interestingly, flag expressions like this on a Seq[String]:
> {code:java}
> .filter(_.getFoo.equals("foo")){code}
> It complains that an Any => Any is compared to a String. Either the inspection is wrong, or somehow this is parsed as (_.getFoo).equals("foo"). In any event, it's easy enough to write this more clearly as:
> {code:java}
> .filter(_.getFoo == "foo"){code}
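> For what it's worth, a quick sketch (with a hypothetical case class standing in for the flagged types) shows both forms behave the same, so the substitution is safe:
> {code:java}
> // Hypothetical stand-in for the flagged pattern; not Spark code.
> case class Item(getFoo: String)
> val items = Seq(Item("foo"), Item("bar"))
>
> items.filter(_.getFoo.equals("foo"))  // List(Item(foo))
> items.filter(_.getFoo == "foo")       // List(Item(foo)) -- same result
> // As a bonus, == is null-safe on the left-hand side: if getFoo returned
> // null, == yields false where .equals would throw NullPointerException.{code}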
> And so on.



