Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/09/02 12:58:56 UTC

[GitHub] [spark] c27kwan commented on a diff in pull request #37771: [SPARK-40315][SQL] Add equals() and hashCode() to ArrayBasedMapData

c27kwan commented on code in PR #37771:
URL: https://github.com/apache/spark/pull/37771#discussion_r961653435


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ArrayBasedMapData.scala:
##########
@@ -35,6 +37,29 @@ class ArrayBasedMapData(val keyArray: ArrayData, val valueArray: ArrayData) exte
   override def toString: String = {
     s"keys: $keyArray, values: $valueArray"
   }
+
+  override def equals(obj: Any): Boolean = {
+    // `this` is never null, and isInstanceOf is false for null, so a single check suffices.
+    if (!obj.isInstanceOf[ArrayBasedMapData]) {
+      return false
+    }
+
+    val other = obj.asInstanceOf[ArrayBasedMapData]
+
+    keyArray.equals(other.keyArray) && valueArray.equals(other.valueArray)
+  }
+
+  // Hash this class as a product of the two underlying hashCodes. We don't know the DataType,
+  // which prevents us from extracting the individual entries and hashing this as a Map.
+  override def hashCode(): Int = {
+    val seed = MurmurHash3.productSeed
+    val keyHash = MurmurHash3.mix(seed, keyArray.hashCode())
+    val valueHash = MurmurHash3.mix(keyHash, valueArray.hashCode())
+    MurmurHash3.finalizeHash(valueHash, 2)
+  }

Review Comment:
   Hi @cloud-fan, can you take a look? Do you think we also need to add an explicit `hashCode()` override to `ArrayData` (used by `keyArray` and `valueArray`)?
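
   For context, here is a minimal sketch (not part of the PR itself) of the behaviour this change is meant to provide. It assumes `GenericArrayData`, whose `equals()`/`hashCode()` compare the element contents:

   ```scala
   import org.apache.spark.sql.catalyst.util.{ArrayBasedMapData, GenericArrayData}

   // Two maps built over equal (but distinct) key/value arrays.
   val a = new ArrayBasedMapData(
     new GenericArrayData(Array[Any](1, 2)), new GenericArrayData(Array[Any](10, 20)))
   val b = new ArrayBasedMapData(
     new GenericArrayData(Array[Any](1, 2)), new GenericArrayData(Array[Any](10, 20)))

   // With this patch, equality and hashing delegate to keyArray/valueArray,
   // so both assertions should hold.
   assert(a == b)
   assert(a.hashCode == b.hashCode)
   ```

   That delegation is why I'm asking about `ArrayData`: the guarantee only holds if the concrete `ArrayData` implementations backing `keyArray`/`valueArray` define `equals()`/`hashCode()` consistently themselves.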



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

