Posted to issues@spark.apache.org by "Arun Jijo (Jira)" <ji...@apache.org> on 2020/01/27 06:20:00 UTC
[jira] [Created] (SPARK-30646) transform_keys function throws exception as "Cannot use null as map key", but there isn't any null key in the map
Arun Jijo created SPARK-30646:
---------------------------------
Summary: transform_keys function throws exception as "Cannot use null as map key", but there isn't any null key in the map
Key: SPARK-30646
URL: https://issues.apache.org/jira/browse/SPARK-30646
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.0.0
Reporter: Arun Jijo
I have started experimenting with the new SQL functions in Spark 3.0 and along the way found an issue with the *transform_keys* function. It raises a "Cannot use null as map key" exception even though the map does not actually contain any null keys.
Find my Spark code below to reproduce the error.
{code:java}
val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")
df.withColumn("employees", transform_keys($"employees", (k, v) => lit(k.+("XYX"))))
  .show
{code}
Exception in thread "main" java.lang.RuntimeException: *Cannot use null as map key*.
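A likely explanation (my reading of the snippet, not confirmed anywhere in this report): `k.+("XYX")` resolves to `Column`'s arithmetic plus, so Spark tries to add the string key to `"XYX"` numerically. The cast of a non-numeric string yields null, and `transform_keys` then rejects the null key. If the intent was string concatenation, the following sketch (assuming an active `SparkSession` with `spark.implicits._` imported) avoids the numeric cast:
{code:java}
import org.apache.spark.sql.functions.{concat, lit, transform_keys}

// Same data as the reproduction above.
val df = Seq(Map("EID_1" -> 10000, "EID_2" -> 25000)).toDF("employees")

// concat keeps the key a string, so no null key is produced.
df.withColumn("employees", transform_keys($"employees", (k, v) => concat(k, lit("XYX"))))
  .show(false)
{code}
If that is indeed the cause, the underlying complaint may be less about a wrong result and more about the error message, since "Cannot use null as map key" does not point at the implicit numeric cast.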
--
This message was sent by Atlassian Jira
(v8.3.4#803005)