Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/25 21:34:39 UTC

HeartSaVioR opened a new pull request #23891: [SPARK-26987][SQL] Add a new method to RowFactory: Row with schema
URL: https://github.com/apache/spark/pull/23891
 
 
   ## What changes were proposed in this pull request?
   
   This patch proposes to expose an official way for the Java API to create a `Row` with a schema.
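   
   A minimal sketch of the intended Java usage (the schema, field names, and values here are illustrative; the `createWithSchema(schema, fields)` call mirrors the manual test below):
   
   ```
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.RowFactory;
   import org.apache.spark.sql.types.DataTypes;
   import org.apache.spark.sql.types.StructField;
   import org.apache.spark.sql.types.StructType;
   
   // Illustrative schema; createWithSchema attaches it to the created Row.
   StructType schema = DataTypes.createStructType(new StructField[] {
           DataTypes.createStructField("id", DataTypes.StringType, false),
           DataTypes.createStructField("f1", DataTypes.LongType, true)
   });
   
   Object[] values = new Object[] { "id1", 42L };
   Row row = RowFactory.createWithSchema(schema, values);
   
   // Because the Row carries its schema, fields can be resolved by name.
   Long f1 = row.getAs("f1");
   ```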
   
   ## How was this patch tested?
   
   This only exposes a way to leverage an existing class through the official API, hence no unit test is added.
   Manually tested with the query below:
   
   ```
   import org.apache.spark.api.java.function.FilterFunction;
   import org.apache.spark.api.java.function.MapFunction;
   import org.apache.spark.sql.Dataset;
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.RowFactory;
   import org.apache.spark.sql.SparkSession;
   import org.apache.spark.sql.catalyst.encoders.RowEncoder;
   import org.apache.spark.sql.streaming.StreamingQuery;
   import org.apache.spark.sql.streaming.StreamingQueryException;
   import org.apache.spark.sql.types.DataTypes;
   import org.apache.spark.sql.types.StructField;
   import org.apache.spark.sql.types.StructType;
   
   SparkSession sparkSession = SparkSession.builder().master("local").getOrCreate();
   StructType inSchema = DataTypes.createStructType(
           new StructField[] {
                   DataTypes.createStructField("id", DataTypes.StringType      , false),
                   DataTypes.createStructField("ts", DataTypes.TimestampType   , false),
                   DataTypes.createStructField("f1", DataTypes.LongType        , true)
           }
   );
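   // Read from the built-in "rate" source, reshape each raw (timestamp, value)
   // row into the target schema via the proposed createWithSchema, then filter
   // and watermark by the schema's field names ("f1", "ts").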
   Dataset<Row> rawSet = sparkSession.sqlContext().readStream()
           .format("rate")
           .option("rowsPerSecond", 1)
           .load()
           .map(   (MapFunction<Row, Row>) raw -> {
                       Object[] fields = new Object[3];
                       fields[0] = "id1";
                       fields[1] = raw.getAs("timestamp");
                       fields[2] = raw.getAs("value");
                       return RowFactory.createWithSchema(inSchema, fields);
                   },
                   RowEncoder.apply(inSchema)
           )
           .filter((FilterFunction<Row>) row -> !row.getAs("f1").equals(0L))
           .withWatermark("ts", "10 seconds");
   
   StreamingQuery streamingQuery = rawSet
           .select("*")
           .writeStream()
           .format("console")
           .outputMode("append")
           .start();
   
   try {
       streamingQuery.awaitTermination(30_000);
   } catch (StreamingQueryException e) {
       System.out.println("Caught exception at 'awaitTermination':");
       e.printStackTrace();
   }
   ```
   
   If we change the map function to `return RowFactory.create(fields);`, the filter function fails, because a `Row` created without a schema cannot resolve the field name in `getAs("f1")`.
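   
   For reference, a minimal sketch of the failing path (illustrative; the exact exception message may vary by Spark version):
   
   ```
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.RowFactory;
   
   // A Row built without a schema cannot map field names to positions,
   // so name-based access fails at runtime.
   Row noSchema = RowFactory.create("id1", 0L);
   noSchema.getAs("f1"); // throws UnsupportedOperationException: fieldIndex on a Row without schema is undefined
   ```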
