Posted to user@spark.apache.org by wxhsdp <wx...@gmail.com> on 2014/04/29 01:46:28 UTC

Re: how to declare tuple return type

you need to import org.apache.spark.rdd.RDD to bring the RDD type into scope.
http://spark.apache.org/docs/latest/api/core/index.html#org.apache.spark.rdd.RDD

here are some examples you can learn from:
https://github.com/apache/spark/tree/master/mllib/src/main/scala/org/apache/spark/mllib
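
for example, once the import is in place, the trait and the overriding
method can both declare the RDD return type. here's a minimal sketch
(the input path and the local sc val are placeholders, not from the
original post):

  import org.apache.spark.SparkContext
  import org.apache.spark.rdd.RDD

  trait VectorSim {
    // the trait must declare the RDD return type as well
    def input(s: String): RDD[(Int, Int, Int)]
  }

  class Sim extends VectorSim {
    override def input(master: String): RDD[(Int, Int, Int)] = {
      val sc = new SparkContext(master, "Test")
      // placeholder path; the original post reads from INP_FILE
      sc.textFile("ratings.tsv")
        .map { line =>
          val fields = line.split("\t")
          (fields(0).toInt, fields(1).toInt, fields(2).toInt)
        }
    }
  }

with both signatures changed to RDD[(Int, Int, Int)], the override
matches the trait and the type mismatch goes away.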



SK wrote
> I am a new user of Spark. I have a class that defines a function as
> follows. It returns a tuple: (Int, Int, Int).
> 
> class Sim extends VectorSim {
>   override def input(master: String): (Int, Int, Int) = {
>     sc = new SparkContext(master, "Test")
>     val ratings = sc.textFile(INP_FILE)
>       .map(line => {
>         val fields = line.split("\t")
>         (fields(0).toInt, fields(1).toInt, fields(2).toInt)
>       })
>     ratings
>   }
> }
> 
> The class extends the trait VectorSim, where the function input() is
> declared as follows.
> 
> trait VectorSim {
>   def input(s: String): (Int, Int, Int)
> }
> 
> However, when I compile, I get a type mismatch saying that input() returns
> RDD[(Int,Int,Int)]. So I changed the return type to RDD[(Int,Int,Int)], but
> the compiler complains that there is no type called RDD. What is the right
> way to declare the return type?
> 
> I am using Spark 0.9.
> 
> thanks

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/how-to-declare-tuple-return-type-tp4985p4993.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.