Posted to user@spark.apache.org by satish chandra j <js...@gmail.com> on 2015/10/16 16:41:35 UTC
Convert SchemaRDD to RDD
Hi All,
To convert a SchemaRDD to an RDD, the below snippet works if the SQL statement
returns fewer than 22 columns per row, as per the tuple restriction
rdd.map(row => row.toString)
But if the SQL statement has more than 22 columns, then the above snippet
fails with the error "*object Tuple27 is not a member of package scala*"
Could anybody please provide inputs on converting a SchemaRDD to an RDD without
using a Tuple in the implementation approach?
Thanks for your valuable inputs in advance
Regards,
Satish Chandra
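[Editor's note: one tuple-free route, sketched under the assumption of the Spark 1.x API in which a SchemaRDD yields org.apache.spark.sql.Row values: keep each row as a Seq or flatten it to a delimited String, neither of which hits the 22-element tuple limit. A plain Seq[Any] stands in for a Row below so the snippet is self-contained.]

```scala
// In Spark 1.x, Row behaves like a sequence of column values; a plain
// Seq[Any] stands in for one here so this runs without a SparkContext.
val row: Seq[Any] = Seq("alice", 30, 2.5)

// Keep all columns, however many, without packing them into a TupleN...
val asSeq: Seq[Any] = row.toSeq
// ...or flatten the row to a delimited String.
val asString: String = row.mkString("|")

println(asString)  // alice|30|2.5
```

[Against a real SchemaRDD the same calls would read schemaRDD.map(_.toSeq) or schemaRDD.map(_.mkString("|")).]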
Re: Convert SchemaRDD to RDD
Posted by Ted Yu <yu...@gmail.com>.
bq. type mismatch found String required Serializable
See line 110:
http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/6-b14/java/lang/String.java#109
Can you pastebin the complete stack trace for the error you encountered?
Cheers
On Fri, Oct 16, 2015 at 8:01 AM, satish chandra j <js...@gmail.com>
wrote:
> Hi Ted,
> I have implemented the below snippet but am getting the error "type mismatch:
> found String, required Serializable" as mentioned in the mail chain
>
> class MyRecord(val val1: String, val val2: String, ... more than 22,
> in this case, e.g., 26)
> extends Product with Serializable {
>
> def canEqual(that: Any): Boolean = that.isInstanceOf[MyRecord]
>
> def productArity: Int = 26 // example value; it is the number of fields
>
> def productElement(n: Int): Serializable = n match {
> case 1 => val1
> case 2 => val2
> //... cases up to 26
> }
> }
>
> hence I am expecting an approach to convert a SchemaRDD to an RDD without
> using a Tuple or a case class, as we have restrictions in Scala 2.10
>
> Regards
> Satish Chandra
>
Re: Convert SchemaRDD to RDD
Posted by satish chandra j <js...@gmail.com>.
Hi Ted,
I have implemented the below snippet but am getting the error "type mismatch:
found String, required Serializable" as mentioned in the mail chain
class MyRecord(val val1: String, val val2: String, ... more than 22,
in this case, e.g., 26)
extends Product with Serializable {
def canEqual(that: Any): Boolean = that.isInstanceOf[MyRecord]
def productArity: Int = 26 // example value; it is the number of fields
def productElement(n: Int): Serializable = n match {
case 1 => val1
case 2 => val2
//... cases up to 26
}
}
hence I am expecting an approach to convert a SchemaRDD to an RDD without using
a Tuple or a case class, as we have restrictions in Scala 2.10
Regards
Satish Chandra
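[Editor's note: a corrected sketch of the Product approach above, shown with 3 fields for brevity; the same pattern extends to 26. On scala.Product, productElement is zero-indexed and declared to return Any, so matching against case 0, 1, ... and returning Any (rather than Serializable) avoids both the off-by-one and the reported type mismatch. The class and field names mirror the snippet in the mail; everything else is illustrative.]

```scala
// A plain class carrying "column" values, usable where a case class would
// exceed the 22-field limit of Scala 2.10.
class MyRecord(val val1: String, val val2: String, val val3: String)
    extends Product with Serializable {

  def canEqual(that: Any): Boolean = that.isInstanceOf[MyRecord]

  def productArity: Int = 3  // the number of fields (26 in the real case)

  // scala.Product's productElement is ZERO-based and returns Any.
  def productElement(n: Int): Any = n match {
    case 0 => val1
    case 1 => val2
    case 2 => val3
    case _ => throw new IndexOutOfBoundsException(n.toString)
  }
}

val r = new MyRecord("a", "b", "c")
// productIterator comes for free from Product, built on arity + element.
println(r.productIterator.mkString(","))  // a,b,c
```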
Re: Convert SchemaRDD to RDD
Posted by Ted Yu <yu...@gmail.com>.
Have you seen this thread ?
http://search-hadoop.com/m/q3RTt9YBFr17u8j8&subj=Scala+Limitation+Case+Class+definition+with+more+than+22+arguments
On Fri, Oct 16, 2015 at 7:41 AM, satish chandra j <js...@gmail.com>
wrote:
> Hi All,
> To convert a SchemaRDD to an RDD, the below snippet works if the SQL statement
> returns fewer than 22 columns per row, as per the tuple restriction
>
> rdd.map(row => row.toString)
>
> But if the SQL statement has more than 22 columns, then the above snippet
> fails with the error "*object Tuple27 is not a member of package scala*"
>
> Could anybody please provide inputs on converting a SchemaRDD to an RDD without
> using a Tuple in the implementation approach?
>
> Thanks for your valuable inputs in advance
>
> Regards,
> Satish Chandra
>