Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2014/11/18 00:25:33 UTC

[jira] [Updated] (SPARK-2449) Spark sql reflection code requires a constructor taking all the fields for the table

     [ https://issues.apache.org/jira/browse/SPARK-2449?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-2449:
------------------------------------
    Target Version/s: 1.3.0  (was: 1.2.0)

> Spark sql reflection code requires a constructor taking all the fields for the table
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-2449
>                 URL: https://issues.apache.org/jira/browse/SPARK-2449
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 1.0.0
>            Reporter: Ian O Connell
>
> The reflection code looks up the parameters of the type's primary constructor to derive the column types for the table. Specifically, this line in ScalaReflection.scala:
>       val params = t.member(nme.CONSTRUCTOR).asMethod.paramss
> Simple repro case from the spark shell:
> trait PersonTrait extends Product
> case class Person(a: Int) extends PersonTrait
> val l: List[PersonTrait] = List(1, 2, 3, 4).map(Person(_))
> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
> import sqlContext._
> scala> sc.parallelize(l).registerAsTable("people")
> scala.ScalaReflectionException: <none> is not a method
>   at scala.reflect.api.Symbols$SymbolApi$class.asMethod(Symbols.scala:279)
>   at scala.reflect.internal.Symbols$SymbolContextApiImpl.asMethod(Symbols.scala:73)
>   at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:52)
>   at 
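The root cause can be reproduced with plain Scala runtime reflection, outside Spark. A minimal sketch (using `termNames`/`paramLists`, the later spellings of the `nme`/`paramss` API in the snippet above): because the element type seen by `schemaFor` is the trait, which has no primary constructor, the member lookup returns `NoSymbol`, whose name prints as `<none>` in the exception message.

```scala
import scala.reflect.runtime.universe._

trait PersonTrait extends Product
case class Person(a: Int) extends PersonTrait

object ReflectionDemo extends App {
  // Looking up the constructor on the trait yields NoSymbol;
  // calling .asMethod on it is what throws "<none> is not a method".
  val traitCtor = typeOf[PersonTrait].member(termNames.CONSTRUCTOR)
  println(traitCtor == NoSymbol)

  // The concrete case class does expose a primary constructor,
  // with the field list the schema inference needs.
  val classCtor = typeOf[Person].member(termNames.CONSTRUCTOR).asMethod
  println(classCtor.paramLists.map(_.map(_.name.toString)))
}
```

This also suggests the obvious user-side workaround: type the collection as the concrete case class (`List[Person]`) rather than the trait, so the constructor lookup succeeds.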



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org