Posted to issues@spark.apache.org by "Bjørn Jørgensen (Jira)" <ji...@apache.org> on 2022/12/29 18:02:00 UTC

[jira] [Created] (SPARK-41774) Remove def test_vectorized_udf_unsupported_types

Bjørn Jørgensen created SPARK-41774:
---------------------------------------

             Summary: Remove def test_vectorized_udf_unsupported_types
                 Key: SPARK-41774
                 URL: https://issues.apache.org/jira/browse/SPARK-41774
             Project: Spark
          Issue Type: Improvement
          Components: Pandas API on Spark
    Affects Versions: 3.4.0
            Reporter: Bjørn Jørgensen


https://github.com/apache/spark/blob/18488158beee5435f99899f99b2e90fb6e37f3d5/python/pyspark/sql/tests/pandas/test_pandas_udf_scalar.py#L603

{code:python}
def test_vectorized_udf_wrong_return_type(self):
    with QuietTest(self.sc):
        for udf_type in [PandasUDFType.SCALAR, PandasUDFType.SCALAR_ITER]:
            with self.assertRaisesRegex(
                NotImplementedError,
                "Invalid return type.*scalar Pandas UDF.*ArrayType.*TimestampType",
{code}

is the same code as

https://github.com/apache/spark/blob/18488158beee5435f99899f99b2e90fb6e37f3d5/python/pyspark/sql/tests/pandas/test_pandas_udf_scalar.py#L679


{code:python}
def test_vectorized_udf_unsupported_types(self):
    with QuietTest(self.sc):
        for udf_type in [PandasUDFType.SCALAR, PandasUDFType.SCALAR_ITER]:
            with self.assertRaisesRegex(
                NotImplementedError,
                "Invalid return type.*scalar Pandas UDF.*ArrayType.*TimestampType",
            ):
                pandas_udf(lambda x: x, ArrayType(TimestampType()), udf_type)
{code}


So we should either remove one of these tests or fix the typo so that they no longer duplicate each other.
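
For context, a minimal standalone sketch of the behavior both tests assert (the names and the expected error message are taken from the test file linked above; running this assumes a local PySpark installation with pandas and pyarrow available):

{code:python}
# Sketch of the check both tests exercise: creating a scalar Pandas UDF with an
# ArrayType(TimestampType()) return type is expected to raise
# NotImplementedError ("Invalid return type with scalar Pandas UDFs ...").
from pyspark.sql.functions import pandas_udf, PandasUDFType
from pyspark.sql.types import ArrayType, TimestampType

try:
    pandas_udf(lambda x: x, ArrayType(TimestampType()), PandasUDFType.SCALAR)
except NotImplementedError as e:
    print(e)
{code}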



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org