Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:34:22 UTC

[jira] [Resolved] (SPARK-14751) SparkR fails on Cassandra map with numeric key

     [ https://issues.apache.org/jira/browse/SPARK-14751?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-14751.
----------------------------------
    Resolution: Incomplete

> SparkR fails on Cassandra map with numeric key
> ----------------------------------------------
>
>                 Key: SPARK-14751
>                 URL: https://issues.apache.org/jira/browse/SPARK-14751
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 1.6.1
>            Reporter: Michał Matłoka
>            Priority: Major
>              Labels: bulk-closed
>
> Hi,
> I have created an issue for the Spark Cassandra Connector ( https://datastax-oss.atlassian.net/projects/SPARKC/issues/SPARKC-366 ), but after a bit of digging it seems this is a better place for it:
> {code}
> CREATE TABLE test.map (
>     id text,
>     somemap map<tinyint, decimal>,
>     PRIMARY KEY (id)
> );
> INSERT INTO test.map (id, somemap) VALUES ('a', { 0 : 12 });
> {code}
> {code}
>   sqlContext <- sparkRSQL.init(sc)
>   test <- read.df(sqlContext, source = "org.apache.spark.sql.cassandra", keyspace = "test", table = "map")
>   head(test)
> {code}
> Results in:
> {code}
> 16/04/19 14:47:02 ERROR RBackendHandler: dfToCols on org.apache.spark.sql.api.r.SQLUtils failed
> Error in readBin(con, raw(), stringLen, endian = "big") :
>   invalid 'n' argument
> {code}
> The problem occurs even with an int key; with a text key it works. Every scenario works under Scala and Python.
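> A minimal Scala sketch of the reportedly working equivalent (illustrative only; it assumes the spark-cassandra-connector is on the classpath, and the variable names are not from the original report):
> {code}
> // Same read as the SparkR snippet above, done from the Scala shell.
> // Per the report, collecting the numeric-keyed map succeeds here.
> val df = sqlContext.read
>   .format("org.apache.spark.sql.cassandra")
>   .options(Map("keyspace" -> "test", "table" -> "map"))
>   .load()
>
> df.printSchema()  // somemap: map<tinyint, decimal>
> df.show()
> {code}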
>  
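> A possible workaround sketch (hypothetical, untested against this setup): since text keys reportedly deserialize fine in SparkR, re-key the map with string keys on the JVM side and write the result to a format SparkR can read back:
> {code}
> import org.apache.spark.sql.functions.udf
>
> // Hypothetical helper: convert map<tinyint, decimal> to map<string, decimal>.
> val stringifyKeys = udf((m: Map[Byte, java.math.BigDecimal]) =>
>   if (m == null) null else m.map { case (k, v) => k.toString -> v })
>
> // Continuing from the df defined in the sketch above.
> val fixed = df.withColumn("somemap", stringifyKeys(df("somemap")))
> fixed.write.parquet("/tmp/map_string_keys")  // hypothetical output path
> // SparkR can then read the Parquet copy, whose map column has text keys.
> {code}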


