Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/07/14 23:37:04 UTC

[jira] [Updated] (SPARK-9045) Fix Scala 2.11 build break in UnsafeExternalRowSorter

     [ https://issues.apache.org/jira/browse/SPARK-9045?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-9045:
------------------------------
    Affects Version/s: 1.5.0
     Target Version/s: 1.5.0

> Fix Scala 2.11 build break in UnsafeExternalRowSorter
> -----------------------------------------------------
>
>                 Key: SPARK-9045
>                 URL: https://issues.apache.org/jira/browse/SPARK-9045
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>            Priority: Blocker
>
> {code}
> [error] /home/jenkins/workspace/Spark-Master-Scala211-Compile/sql/catalyst/src/main/java/org/apache/spark/sql/execution/UnsafeExternalRowSorter.java:135: error: <anonymous org.apache.spark.sql.execution.UnsafeExternalRowSorter$1> is not abstract and does not override abstract method <B>minBy(Function1<InternalRow,B>,Ordering<B>) in TraversableOnce
> [error]       return new AbstractScalaRowIterator() {
> [error]                                             ^
> [error]   where B,A are type-variables:
> [error]     B extends Object declared in method <B>minBy(Function1<A,B>,Ordering<B>)
> [error]     A extends Object declared in interface TraversableOnce
> [error] 1 error
> [error] Compile failed at Jul 14, 2015 2:26:25 PM [26.443s]
> {code}
> It turns out that this can be fixed by making AbstractScalaRowIterator a concrete class instead of an abstract class: once the class is concrete, scalac must generate the TraversableOnce mixin methods (such as minBy) in the class itself, so the anonymous Java subclass no longer has to provide them.
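> A minimal sketch of the shape of that fix (simplified; not the exact Spark source):
> {code}
> import org.apache.spark.sql.catalyst.InternalRow
>
> // While this class was abstract, scalac 2.11 left the TraversableOnce
> // mixin methods (e.g. minBy) abstract too, and the anonymous Java
> // subclass in UnsafeExternalRowSorter could not implement them. Making
> // the class concrete forces scalac to emit those mixin methods here,
> // so Java callers only need to override hasNext/next.
> class AbstractScalaRowIterator extends Iterator[InternalRow] {
>   override def hasNext: Boolean = throw new NotImplementedError
>   override def next(): InternalRow = throw new NotImplementedError
> }
> {code}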



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org