Posted to issues@spark.apache.org by "Xiangrui Meng (JIRA)" <ji...@apache.org> on 2015/07/02 19:00:06 UTC
[jira] [Resolved] (SPARK-8647) Potential issues with the constant hashCode
[ https://issues.apache.org/jira/browse/SPARK-8647?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Xiangrui Meng resolved SPARK-8647.
----------------------------------
Resolution: Fixed
Fix Version/s: 1.5.0
Issue resolved by pull request 7146
[https://github.com/apache/spark/pull/7146]
> Potential issues with the constant hashCode
> --------------------------------------------
>
> Key: SPARK-8647
> URL: https://issues.apache.org/jira/browse/SPARK-8647
> Project: Spark
> Issue Type: Improvement
> Components: MLlib
> Affects Versions: 1.4.0
> Reporter: Alok Singh
> Assignee: Alok Singh
> Priority: Minor
> Labels: performance
> Fix For: 1.5.0
>
>
> Hi,
> This may be a potential bug, a performance issue, or just a code-documentation gap.
> The issue concerns the MatrixUDT class.
> If we put instances of MatrixUDT into a hash-based collection, the hashCode function returns a constant. Even though the equals method is consistent with hashCode, I don't see why hashCode() = 1994 (i.e. a constant) was used.
> I was expecting it to be similar to the other matrix classes or the vector class.
> If there is a reason for this code, we should document it properly so that others reading it understand.
> regards,
> Alok
> Details
> =====
> a)
> In reference to the file
> https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/mllib/linalg/Matrices.scala
> lines 188-197, i.e.
>
> override def equals(o: Any): Boolean = {
>   o match {
>     case v: MatrixUDT => true
>     case _ => false
>   }
> }
>
> override def hashCode(): Int = 1994
> b) The commit is
> https://github.com/apache/spark/commit/11e025956be3818c00effef0d650734f8feeb436
> on March 20.
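A minimal sketch of the performance concern raised above, assuming nothing about Spark's actual classes (ConstantHashKey and DistinctHashKey are hypothetical names, and they use id-based equality rather than MatrixUDT's type-based equals): when every instance returns the same hashCode, all elements of a hash-based collection land in one bucket, so lookups degrade from roughly O(1) toward O(n).

```scala
import scala.collection.mutable

// Hypothetical key mirroring the quoted pattern: every instance
// hashes to the constant 1994, so all instances collide.
class ConstantHashKey(val id: Int) {
  override def equals(o: Any): Boolean = o match {
    case k: ConstantHashKey => k.id == id
    case _ => false
  }
  override def hashCode(): Int = 1994 // all instances share one bucket
}

// Same key with a content-derived hashCode: instances spread
// across buckets, so lookups stay close to O(1).
class DistinctHashKey(val id: Int) {
  override def equals(o: Any): Boolean = o match {
    case k: DistinctHashKey => k.id == id
    case _ => false
  }
  override def hashCode(): Int = id
}

object HashCodeDemo {
  // Insert the keys into a HashSet, probe each one, and return
  // the elapsed lookup time in milliseconds.
  def timeLookups[K](keys: Seq[K]): Long = {
    val set = mutable.HashSet.empty[K]
    keys.foreach(set += _)
    val start = System.nanoTime()
    keys.foreach(set.contains)
    (System.nanoTime() - start) / 1000000
  }

  def main(args: Array[String]): Unit = {
    val n = 5000
    val constant = timeLookups((1 to n).map(new ConstantHashKey(_)))
    val distinct = timeLookups((1 to n).map(new DistinctHashKey(_)))
    // The constant-hash set is typically far slower at this size.
    println(s"constant hashCode: $constant ms, distinct hashCode: $distinct ms")
  }
}
```

Note that both classes still satisfy the equals/hashCode contract (equal objects have equal hash codes); a constant hashCode is legal, just slow in hash-based collections.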
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org