Posted to user@spark.apache.org by Laeeq Ahmed <la...@yahoo.com.INVALID> on 2015/03/06 20:06:23 UTC
Help with transformWith in SparkStreaming
Hi,
I am filtering the first DStream using the value in the second DStream, and I also want to keep the value of the second DStream. I have done the following, but I am having a problem returning a new RDD:
val transformedFileAndTime = fileAndTime.transformWith(anomaly,
  (rdd1: RDD[(String, String)], rdd2: RDD[Int]) => {
    var first = " "
    var second = " "
    var third = 0
    if (rdd2.first <= 3) {
      first = rdd1.map(_._1).first
      second = rdd1.map(_._2).first
      third = rdd2.first
    }
    RDD[(first, second, third)]
  })
[ERROR] /home/hduser/Projects/scalaad/src/main/scala/eeg/anomd/StreamAnomalyDetector.scala:119: error: not found: value RDD
[ERROR] RDD[(first,second,third)]
I have already imported org.apache.spark.rdd.RDD.
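For reference, a minimal corrected sketch of the transform above. The compiler error arises because `RDD[...]` is a type, not a value, so it cannot be used as the closure's return expression; the closure passed to `transformWith` must construct and return an actual RDD, for example via the SparkContext. This sketch assumes a single-element result RDD is acceptable; `fileAndTime` and `anomaly` are the DStreams from the original code.

```scala
import org.apache.spark.rdd.RDD

// Sketch only: the last expression of the closure must be an RDD value.
// Here we pull the driver-side scalars out of the input RDDs and package
// them into a one-element RDD with parallelize.
val transformedFileAndTime = fileAndTime.transformWith(anomaly,
  (rdd1: RDD[(String, String)], rdd2: RDD[Int]) => {
    var first = " "
    var second = " "
    var third = 0
    if (rdd2.first <= 3) {
      first = rdd1.map(_._1).first
      second = rdd1.map(_._2).first
      third = rdd2.first
    }
    // Construct the result RDD instead of naming the RDD type:
    rdd1.sparkContext.parallelize(Seq((first, second, third)))
  })
```

Note that calling `first` three times on `rdd1` triggers three separate jobs; a single `rdd1.first` stored in a local val would avoid that.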
Regards,
Laeeq