Posted to issues@spark.apache.org by "Matthew Farrellee (JIRA)" <ji...@apache.org> on 2014/09/18 16:36:33 UTC

[jira] [Commented] (SPARK-3581) RDD API (distinct/subtract) does not work for RDD of Dictionaries

    [ https://issues.apache.org/jira/browse/SPARK-3581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14139000#comment-14139000 ] 

Matthew Farrellee commented on SPARK-3581:
------------------------------------------

A dict is mutable and therefore isn't hashable in Python, so operations like distinct() and subtract(), which need to hash their elements, can't handle it. The same is true for lists:

{code}
>>> sc.parallelize(([1,],[1,])).distinct().collect()
...
TypeError: unhashable type: 'list'
...
{code}
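
For what it's worth, there is a straightforward workaround: map each dict through a hashable canonical form, run the set operation, then map back. A minimal sketch (to_key/to_dict are illustrative names, not Spark API, and it assumes every value in the dicts is itself hashable, as in the reporter's example below):

{code}
# Sketch of a workaround, using the reporter's dictRDD defined below.
# Tuples are immutable and therefore hashable, so a tuple of sorted
# items can stand in for a dict during distinct()/subtract().

def to_key(d):
    # canonical hashable form: the dict's items, sorted by key
    return tuple(sorted(d.items()))

def to_dict(t):
    # invert to_key
    return dict(t)

distinct_dicts = dictRDD.map(to_key).distinct().map(to_dict)
diff_dicts = dictRDD.map(to_key).subtract(dictRDD.map(to_key)).map(to_dict)
{code}

Note this only works one level deep; nested lists or dicts as values would need a recursive conversion.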

If you agree, I'll close this as not a bug.

> RDD API (distinct/subtract) does not work for RDD of Dictionaries
> -----------------------------------------------------------------
>
>                 Key: SPARK-3581
>                 URL: https://issues.apache.org/jira/browse/SPARK-3581
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.0.0, 1.0.2, 1.1.0
>         Environment: Spark 1.0, 1.1
> JDK 1.6
>            Reporter: Shawn Guo
>            Priority: Minor
>
> Construct an RDD of dictionaries (dictRDD),
> then call the RDD API methods RDD.distinct() or RDD.subtract().
> {code:title=PySpark RDD API Test|borderStyle=solid}
> dictRDD = sc.parallelize((
>     {'MOVIE_ID': 1, 'MOVIE_NAME': 'Lord of the Rings', 'MOVIE_DIRECTOR': 'Peter Jackson'},
>     {'MOVIE_ID': 2, 'MOVIE_NAME': 'King Kong', 'MOVIE_DIRECTOR': 'Peter Jackson'},
>     {'MOVIE_ID': 2, 'MOVIE_NAME': 'King Kong', 'MOVIE_DIRECTOR': 'Peter Jackson'}))
> dictRDD.distinct().collect()
> dictRDD.subtract(dictRDD).collect()
> {code}
> Both calls fail with an error:
> TypeError: unhashable type: 'dict'
> I'm not sure if this is a bug or the expected result.


