Posted to issues@spark.apache.org by "Davies Liu (JIRA)" <ji...@apache.org> on 2015/07/15 18:49:05 UTC

[jira] [Resolved] (SPARK-8840) Float type coercion with hiveContext

     [ https://issues.apache.org/jira/browse/SPARK-8840?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Davies Liu resolved SPARK-8840.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0

Issue resolved by pull request 7280
[https://github.com/apache/spark/pull/7280]

> Float type coercion with hiveContext
> ------------------------------------
>
>                 Key: SPARK-8840
>                 URL: https://issues.apache.org/jira/browse/SPARK-8840
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.4.0
>            Reporter: Evgeny SInelnikov
>             Fix For: 1.5.0
>
>
> Problem with +float+ type coercion in SparkR when using hiveContext.
> {code}
> > result <- sql(hiveContext, "SELECT offset, percentage from data limit 100")
> > show(result)
> DataFrame[offset:float, percentage:float]
> > head(result)
> Error in as.data.frame.default(x[[i]], optional = TRUE) :
>     cannot coerce class ""jobj"" to a data.frame
> {code}
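> Until a fixed build is available, one possible workaround (a sketch, assuming the failure is specific to deserializing +float+ columns in SparkR) is to cast the affected columns to double on the SQL side, so SparkR only ever sees doubles:
> {code}
> # Hypothetical workaround: cast the float columns to double in the query
> # itself, so no FloatType value reaches the SparkR deserializer.
> result <- sql(hiveContext, paste(
>   "SELECT CAST(offset AS DOUBLE) AS offset,",
>   "       CAST(percentage AS DOUBLE) AS percentage",
>   "FROM data LIMIT 100"))
> head(result)   # should now return an ordinary R data.frame
> {code}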
> This looks related to an existing issue (SPARK-2863 - "Emulate Hive type
> coercion in native reimplementations of Hive functions") and appears to
> share the same root cause: the native reimplementation is incomplete,
> and not only for Hive functions.
> I used the Spark 1.4.0 binaries from the official site:
> http://spark.apache.org/downloads.html
> and ran them on:
> * Hortonworks HDP 2.2.0.0-2041
> * with Hive 0.14
> * with the Application Timeline Server hooks (ATSHook) disabled in hive-site.xml (sketched after this list) by commenting out:
> ** hive.exec.failure.hooks,
> ** hive.exec.post.hooks,
> ** hive.exec.pre.hooks.
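> For reference, the disabled hooks looked roughly like this in hive-site.xml (an illustrative sketch; only the three property names come from the actual configuration, and the ATSHook class value is the usual HDP default, not confirmed here):
> {code}
> <!-- ATSHook entries commented out to disable the Application Timeline Server hooks -->
> <!--
> <property>
>   <name>hive.exec.pre.hooks</name>
>   <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
> </property>
> <property>
>   <name>hive.exec.post.hooks</name>
>   <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
> </property>
> <property>
>   <name>hive.exec.failure.hooks</name>
>   <value>org.apache.hadoop.hive.ql.hooks.ATSHook</value>
> </property>
> -->
> {code}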



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org