Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/05/12 04:12:00 UTC

[jira] [Created] (SPARK-35382) Fix lambda variable name issues in nested DataFrame functions in Python APIs

Hyukjin Kwon created SPARK-35382:
------------------------------------

             Summary: Fix lambda variable name issues in nested DataFrame functions in Python APIs
                 Key: SPARK-35382
                 URL: https://issues.apache.org/jira/browse/SPARK-35382
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.1.1
            Reporter: Hyukjin Kwon


The Python side has the same issue as SPARK-34794: when DataFrame higher-order functions are nested, the generated lambda variables collide, so the inner lambda's variable shadows the outer one. In the example below, every struct ends up holding the letter twice (e.g. {a, a}) instead of the number/letter pair (e.g. {1, a}).

{code}
# Runnable in the PySpark shell, where a SparkSession named `spark` is predefined.
from pyspark.sql.functions import transform, struct

df = spark.sql("SELECT array(1, 2, 3) AS numbers, array('a', 'b', 'c') AS letters")
df.select(
    transform(
        "numbers",
        lambda n: transform("letters", lambda l: struct(n.alias("n"), l.alias("l")))
    )
).show()
{code}

{code}
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|transform(numbers, lambdafunction(transform(letters, lambdafunction(struct(namedlambdavariable() AS n, namedlambdavariable() AS l), namedlambdavariable())), namedlambdavariable()))|
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
|                                                                                                                                                                [[{a, a}, {b, b},...|
+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
{code}
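
As a possible workaround until a fix lands, the same nested transform can be written with Spark SQL's higher-order function syntax, where the lambda variables are named explicitly and therefore do not collide. A minimal sketch (not from the original report), assuming the same {{df}} as above:

{code}
# Workaround sketch: the SQL lambda syntax names the variables (n, l) explicitly,
# so the inner struct captures the outer variable correctly.
df.selectExpr(
    "transform(numbers, n -> transform(letters, l -> struct(n, l))) AS pairs"
).show(truncate=False)
{code}

With explicit variable names, the structs should contain the number/letter pairs (e.g. {1, a}) rather than the letter twice.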


