Posted to issues@spark.apache.org by "Frank Oosterhuis (Jira)" <ji...@apache.org> on 2020/06/22 12:16:00 UTC

[jira] [Updated] (SPARK-32051) Dataset.foreachPartition returns object

     [ https://issues.apache.org/jira/browse/SPARK-32051?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Frank Oosterhuis updated SPARK-32051:
-------------------------------------
    Description: 
I'm trying to map values from a Dataset[Row], but since 3.0.0 this fails.

In 3.0.0 I get a compile error: "Error:(28, 38) value map is not a member of Object"

This is the simplest code that works in 2.4.x but fails in 3.0.0:
{code:scala}
spark.range(100)
  .repartition(10)
  .foreachPartition(part => println(part.toList))
{code}
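The error looks like an overload-resolution ambiguity: Dataset defines both a Scala foreachPartition(f: Iterator[T] => Unit) and a Java-facing foreachPartition(func: ForeachPartitionFunction[T]), and with an untyped lambda the parameter ends up inferred as Object. A minimal workaround sketch, assuming that is the cause (spark.range returns a Dataset[java.lang.Long], which is why the explicit annotation below uses that type):
{code:scala}
// Possible workaround sketch, assuming the failure is overload ambiguity:
// annotating the lambda parameter forces the Scala Iterator overload,
// since a scala.collection.Iterator cannot SAM-convert to the Java variant.
spark.range(100)
  .repartition(10)
  .foreachPartition((part: Iterator[java.lang.Long]) => println(part.toList))

// Alternatively, drop to the RDD API, which has a single foreachPartition:
spark.range(100)
  .repartition(10)
  .rdd
  .foreachPartition(part => println(part.toList))
{code}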

  was:
I'm trying to map values from a Dataset[Row], but since 3.0.0 this fails.

In 3.0.0 I get a compile error: "Error:(28, 38) value map is not a member of Object"

I've made a small sample project with code that works in 2.4.x but fails in 3.0.0:

https://github.com/frankivo/partitions/blob/master/src/main/scala/com/github/frankivo/partitions.scala


> Dataset.foreachPartition returns object
> ---------------------------------------
>
>                 Key: SPARK-32051
>                 URL: https://issues.apache.org/jira/browse/SPARK-32051
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Frank Oosterhuis
>            Priority: Critical
>
> I'm trying to map values from a Dataset[Row], but since 3.0.0 this fails.
> In 3.0.0 I get a compile error: "Error:(28, 38) value map is not a member of Object"
>
> This is the simplest code that works in 2.4.x but fails in 3.0.0:
> {code:scala}
> spark.range(100)
>   .repartition(10)
>   .foreachPartition(part => println(part.toList))
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org