Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/07/17 02:52:00 UTC

[jira] [Created] (SPARK-21432) Reviving broken partial functions in UDF in PySpark

Hyukjin Kwon created SPARK-21432:
------------------------------------

             Summary: Reviving broken partial functions in UDF in PySpark
                 Key: SPARK-21432
                 URL: https://issues.apache.org/jira/browse/SPARK-21432
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.3.0
            Reporter: Hyukjin Kwon


This is related to SPARK-21394.

We also happened to break support for partial functions in UDFs.

Spark 2.1:

{code}
>>> from pyspark.sql import functions
>>> from functools import partial
>>>
>>>
>>> partial_func = partial(lambda x: x, x=1)
>>> udf = functions.udf(partial_func)
>>> spark.range(1).select(udf()).show()
+---------+
|partial()|
+---------+
|        1|
+---------+
{code}

master:

{code}
>>> from pyspark.sql import functions
>>> from functools import partial
>>>
>>>
>>> partial_func = partial(lambda x: x, x=1)
>>> udf = functions.udf(partial_func)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File ".../spark/python/pyspark/sql/functions.py", line 2154, in udf
    return _udf(f=f, returnType=returnType)
  File ".../spark/python/pyspark/sql/functions.py", line 2145, in _udf
    return udf_obj._wrapped()
  File ".../spark/python/pyspark/sql/functions.py", line 2099, in _wrapped
    @functools.wraps(self.func, assigned=assignments)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/functools.py", line 33, in update_wrapper
    setattr(wrapper, attr, getattr(wrapped, attr))
AttributeError: 'functools.partial' object has no attribute '__module__'
{code}
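
The failure comes from functools.wraps: it copies attributes such as __module__ and __name__ from the wrapped callable onto the wrapper, but a functools.partial object does not expose all of them, and on Python 2 update_wrapper raises AttributeError instead of skipping the missing ones. A minimal sketch of the workaround in plain Python, outside Spark (the wrapper and assignments names below are illustrative, not the actual Spark patch):

{code}
import functools
from functools import partial

partial_func = partial(lambda x: x, x=1)

# functools.wraps copies WRAPPER_ASSIGNMENTS ('__module__', '__name__', ...)
# from the wrapped callable onto the wrapper. A functools.partial object
# lacks some of these attributes, and on Python 2 update_wrapper raises
# AttributeError rather than skipping them. Filtering the assignments down
# to the attributes the callable actually has avoids the crash:
assignments = tuple(
    a for a in functools.WRAPPER_ASSIGNMENTS if hasattr(partial_func, a)
)

@functools.wraps(partial_func, assigned=assignments)
def wrapper(*args, **kwargs):
    return partial_func(*args, **kwargs)

print(wrapper())  # prints 1
{code}

Note that the traceback above shows _wrapped already passes a filtered assigned= list to functools.wraps, so presumably the existing filter just does not yet account for partial objects.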


