Posted to issues@spark.apache.org by "Rafael (Jira)" <ji...@apache.org> on 2020/08/14 19:07:00 UTC

[jira] [Commented] (SPARK-26132) Remove support for Scala 2.11 in Spark 3.0.0

    [ https://issues.apache.org/jira/browse/SPARK-26132?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17178019#comment-17178019 ] 

Rafael commented on SPARK-26132:
--------------------------------

[~srowen]

In the release notes for Spark 3.0.0 they mentioned your ticket:
{quote}Due to the upgrade of Scala 2.12, {{DataStreamWriter.foreachBatch}} is not source compatible for Scala program. You need to update your Scala source code to disambiguate between Scala function and Java lambda. (SPARK-26132)
{quote}
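
For {{foreachBatch}} my understanding of that note is that you bind the handler to a val with an explicit Scala function type, so overload resolution picks the Scala {{foreachBatch(function: (Dataset[T], Long) => Unit)}} overload rather than the Java {{VoidFunction2}} one. A sketch of what I mean (assuming a streaming DataFrame called {{streamingDF}}; the parquet sink path is just a placeholder):
{code:scala}
// Sketch only: a val with an explicit Scala function type is not a function literal,
// so it cannot SAM-convert to VoidFunction2 and the Scala overload is selected.
// streamingDF is a hypothetical streaming DataFrame; the output path is made up.
import org.apache.spark.sql.DataFrame

val writeBatch: (DataFrame, Long) => Unit = (batchDF, batchId) => {
  batchDF.write.mode("append").parquet("/tmp/output")
}

streamingDF.writeStream
  .foreachBatch(writeBatch)
  .start()
{code}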
 

So maybe you know how we should now use *foreachPartition* in Scala code:
{code:java}
dataFrame.foreachPartition { partition =>
  partition
    .grouped(Config.BATCH_SIZE)
    .foreach { batch =>
      ....
    }
}
{code}
Right now, calling any method on the partition such as {{grouped}} or {{foreach}} fails with the compile error
*value grouped is not a member of Object*
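
Would the intended fix be to annotate the parameter type explicitly, so the Scala overload {{foreachPartition(f: Iterator[T] => Unit)}} gets picked instead of the Java {{ForeachPartitionFunction}} one? A sketch of what I mean (assuming {{dataFrame}} is a {{Dataset[Row]}}; {{Config.BATCH_SIZE}} and the batch handling are placeholders from our code):
{code:scala}
// Sketch only: the explicit Iterator[Row] annotation rules out the Java
// ForeachPartitionFunction overload (whose SAM method takes a java.util.Iterator),
// so the Scala overload foreachPartition(f: Iterator[T] => Unit) is selected
// and methods like grouped/foreach are available again.
import org.apache.spark.sql.Row

dataFrame.foreachPartition { (partition: Iterator[Row]) =>
  partition
    .grouped(Config.BATCH_SIZE)
    .foreach { batch =>
      // send the batch to the external store here
    }
}
{code}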

> Remove support for Scala 2.11 in Spark 3.0.0
> --------------------------------------------
>
>                 Key: SPARK-26132
>                 URL: https://issues.apache.org/jira/browse/SPARK-26132
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Sean R. Owen
>            Assignee: Sean R. Owen
>            Priority: Major
>              Labels: release-notes
>             Fix For: 3.0.0
>
>
> Per some discussion on the mailing list, we are _considering_ formally not supporting Scala 2.11 in Spark 3.0. This JIRA tracks that discussion.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org