Posted to dev@spark.apache.org by Praveen Srivastava <pr...@oracle.com> on 2019/12/29 12:15:21 UTC

Unsubscribe

 

 

From: Jean-Georges Perrin <jg...@jgp.net> 
Sent: Saturday, December 28, 2019 11:08 PM
To: dev <de...@spark.apache.org>
Subject: Issue with map Java lambda function with 3.0.0 preview and preview 2

 

Hey guys,

 

This code:

 

    Dataset<Row> incrementalDf = spark
        .createDataset(l, Encoders.INT())
        .toDF();
    Dataset<Integer> dotsDs = incrementalDf
        .map(status -> {
          double x = Math.random() * 2 - 1;
          double y = Math.random() * 2 - 1;
          counter++;
          if (counter % 100000 == 0) {
            System.out.println("" + counter + " darts thrown so far");
          }
          return (x * x + y * y <= 1) ? 1 : 0;
        }, Encoders.INT());

 

used to work with Spark 2.x; in the two 3.0.0 previews, it fails with:

 

The method map(Function1<Row,Integer>, Encoder<Integer>) is ambiguous for the type Dataset<Row>
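[Not from the original thread: the error likely arises because `Dataset.map` has two overloads, one taking Scala's `Function1` and one taking Java's `MapFunction`, and a bare lambda is compatible with both functional interfaces. A common workaround is to cast the lambda, e.g. `.map((MapFunction<Row, Integer>) status -> ..., Encoders.INT())`. A minimal self-contained sketch of the same ambiguity, using stand-in interfaces rather than Spark's real classes:]

```java
// Sketch of the ambiguity: two overloads accept different functional
// interfaces (stand-ins for scala.Function1 and Spark's MapFunction),
// and an untyped lambda matches both, so the compiler cannot choose.
public class AmbiguityDemo {
  interface Function1<T, R> { R apply(T t); }              // stand-in for scala.Function1
  interface MapFunction<T, R> { R call(T t) throws Exception; } // stand-in for Spark's MapFunction

  static <R> String map(Function1<Integer, R> f) { return "scala overload"; }
  static <R> String map(MapFunction<Integer, R> f) { return "java overload"; }

  public static void main(String[] args) {
    // map(x -> x + 1);  // does not compile: ambiguous, matches both overloads
    // An explicit cast gives the lambda a single target type:
    String picked = map((MapFunction<Integer, Integer>) x -> x + 1);
    System.out.println(picked); // prints "java overload"
  }
}
```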

 

If I define my mapping function as a class, it works fine. Here is the class:

 

  private final class DartMapper
      implements MapFunction<Row, Integer> {
    private static final long serialVersionUID = 38446L;

    @Override
    public Integer call(Row r) throws Exception {
      double x = Math.random() * 2 - 1;
      double y = Math.random() * 2 - 1;
      counter++;
      if (counter % 1000 == 0) {
        System.out.println("" + counter + " operations done so far");
      }
      return (x * x + y * y <= 1) ? 1 : 0;
    }
  }

 

Any hint on whether, and if so what, I did wrong?

 

jg

 

 

 
