Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/12/29 12:06:00 UTC

[jira] [Created] (SPARK-41770) eqNullSafe does not support None as its argument

Hyukjin Kwon created SPARK-41770:
------------------------------------

             Summary: eqNullSafe does not support None as its argument
                 Key: SPARK-41770
                 URL: https://issues.apache.org/jira/browse/SPARK-41770
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Hyukjin Kwon


{code}
**********************************************************************
File "/.../spark/python/pyspark/sql/connect/column.py", line 90, in pyspark.sql.connect.column.Column.eqNullSafe
Failed example:
    df1.select(
        df1['value'] == 'foo',
        df1['value'].eqNullSafe('foo'),
        df1['value'].eqNullSafe(None)
    ).show()
Exception raised:
    Traceback (most recent call last):
      File "/.../miniconda3/envs/python3.9/lib/python3.9/doctest.py", line 1336, in __run
        exec(compile(example.source, filename, "single",
      File "<doctest pyspark.sql.connect.column.Column.eqNullSafe[2]>", line 4, in <module>
        df1['value'].eqNullSafe(None)
      File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 78, in wrapped
        return scalar_function(name, self, other)
      File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 95, in scalar_function
        return Column(UnresolvedFunction(op, [arg._expr for arg in args]))
      File "/.../workspace/forked/spark/python/pyspark/sql/connect/column.py", line 95, in <listcomp>
        return Column(UnresolvedFunction(op, [arg._expr for arg in args]))
    AttributeError: 'NoneType' object has no attribute '_expr'
{code}
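
The traceback shows that `scalar_function` builds an `UnresolvedFunction` by reading `arg._expr` from every argument, so a bare Python `None` (which has no `_expr` attribute) raises `AttributeError` before the expression ever reaches the server. A minimal sketch of the failure mode and one possible remedy follows: coercing plain Python values (including `None`) into literal expressions before the attribute access, in the spirit of PySpark's `lit()`. The classes below are hypothetical stand-ins, not the real Spark Connect API.

{code}
# Hypothetical stand-ins modeling the shape of the Connect column module.
class LiteralExpression:
    def __init__(self, value):
        self.value = value

class ColumnExpression:
    def __init__(self, name):
        self.name = name

class Column:
    def __init__(self, expr):
        self._expr = expr

def _to_expr(arg):
    # Coerce: Columns pass through their expression; anything else
    # (including None) becomes a literal instead of raising AttributeError.
    return arg._expr if isinstance(arg, Column) else LiteralExpression(arg)

def scalar_function(name, *args):
    # With coercion in place, eqNullSafe(None) builds a valid expression.
    exprs = [_to_expr(arg) for arg in args]
    return Column((name, exprs))

col = Column(ColumnExpression("value"))
result = scalar_function("eqNullSafe", col, None)
assert isinstance(result._expr[1][1], LiteralExpression)
assert result._expr[1][1].value is None
{code}

Whether the coercion belongs in `scalar_function` itself or in the `wrapped` helper at line 78 of column.py is an implementation choice for the actual fix.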



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
