Posted to issues@spark.apache.org by "Feynman Liang (JIRA)" <ji...@apache.org> on 2015/08/30 02:18:45 UTC

[jira] [Updated] (SPARK-10352) Replace SQLTestData internal usages of String with UTF8String

     [ https://issues.apache.org/jira/browse/SPARK-10352?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Feynman Liang updated SPARK-10352:
----------------------------------
    Summary: Replace SQLTestData internal usages of String with UTF8String  (was: Replace internal usages of String with UTF8String)

> Replace SQLTestData internal usages of String with UTF8String
> -------------------------------------------------------------
>
>                 Key: SPARK-10352
>                 URL: https://issues.apache.org/jira/browse/SPARK-10352
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Feynman Liang
>
> Running the code:
> {code}
> import org.apache.spark.sql.catalyst.InternalRow
> import org.apache.spark.sql.catalyst.expressions.UnsafeProjection
> import org.apache.spark.sql.types.{DataType, StringType}
> val inputString = "abc"
> val row = InternalRow.apply(inputString)
> val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
> {code}
> generates the error:
> {code}
> [info]   java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
> [info]   at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow$class.getUTF8String(rows.scala:46)
> ***snip***
> {code}
> Although {{StringType}}'s internal type should, in theory, always be {{UTF8String}}, we [are inconsistent in enforcing this constraint|https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala#L131], and enforcing it strictly would [break existing code|https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestData.scala#L41].
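> A minimal sketch of the workaround under this constraint (names mirror the snippet above): convert the {{String}} to the internal representation with {{UTF8String.fromString}} before building the {{InternalRow}}:
> {code}
> import org.apache.spark.sql.catalyst.InternalRow
> import org.apache.spark.sql.catalyst.expressions.UnsafeProjection
> import org.apache.spark.sql.types.{DataType, StringType}
> import org.apache.spark.unsafe.types.UTF8String
>
> // StringType rows must carry UTF8String, not java.lang.String;
> // otherwise getUTF8String throws a ClassCastException as above.
> val inputString = UTF8String.fromString("abc")
> val row = InternalRow.apply(inputString)
> val unsafeRow = UnsafeProjection.create(Array[DataType](StringType)).apply(row)
> {code}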



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org