Posted to issues@spark.apache.org by "Sandeep Singh (Jira)" <ji...@apache.org> on 2023/01/05 17:54:00 UTC

[jira] [Created] (SPARK-41905) Function `slice` should expect string in params

Sandeep Singh created SPARK-41905:
-------------------------------------

             Summary: Function `slice` should expect string in params
                 Key: SPARK-41905
                 URL: https://issues.apache.org/jira/browse/SPARK-41905
             Project: Spark
          Issue Type: Sub-task
          Components: Connect
    Affects Versions: 3.4.0
            Reporter: Sandeep Singh
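
The summary refers to {{pyspark.sql.functions.slice}}. In the classic PySpark API, the `start` and `length` parameters can be passed as column names (strings), and the Connect client is expected to accept the same. A minimal illustrative sketch (the session and data below are not from the ticket; `slice` requires int-typed columns, hence the explicit schema):
{code:python}
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [([1, 2, 3, 4], 2, 2)],
    "x array<int>, start int, length int",
)
# Passing "start" and "length" as column-name strings should be accepted,
# resolving them per row: slice([1, 2, 3, 4], 2, 2) -> [2, 3].
df.select(F.slice("x", "start", "length").alias("sliced")).show()
{code}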


{code:python}
from pyspark.sql import Window
from pyspark.sql.functions import nth_value

df = self.spark.createDataFrame(
    [
        ("a", 0, None),
        ("a", 1, "x"),
        ("a", 2, "y"),
        ("a", 3, "z"),
        ("a", 4, None),
        ("b", 1, None),
        ("b", 2, None),
    ],
    schema=("key", "order", "value"),
)
w = Window.partitionBy("key").orderBy("order")

# The three window columns use ignoreNulls unset, False, and True respectively.
rs = df.select(
    df.key,
    df.order,
    nth_value("value", 2).over(w),
    nth_value("value", 2, False).over(w),
    nth_value("value", 2, True).over(w),
).collect()

expected = [
    ("a", 0, None, None, None),
    ("a", 1, "x", "x", None),
    ("a", 2, "x", "x", "y"),
    ("a", 3, "x", "x", "y"),
    ("a", 4, "x", "x", "y"),
    ("b", 1, None, None, None),
    ("b", 2, None, None, None),
]

for r, ex in zip(sorted(rs), sorted(expected)):
    self.assertEqual(tuple(r), ex[: len(r)])
{code}
{code:none}
Traceback (most recent call last):
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_functions.py", line 755, in test_nth_value
    self.assertEqual(tuple(r), ex[: len(r)])
AssertionError: Tuples differ: ('a', 1, 'x', None) != ('a', 1, 'x', 'x')

First differing element 3:
None
'x'

- ('a', 1, 'x', None)
?               ^^^^

+ ('a', 1, 'x', 'x')
?               ^^^
{code}
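
For reference, the reproduction can be run outside the test suite; a minimal standalone sketch assuming a local SparkSession (the session setup is not part of the original ticket):
{code:python}
from pyspark.sql import SparkSession, Window
from pyspark.sql.functions import nth_value

spark = SparkSession.builder.master("local[1]").getOrCreate()
df = spark.createDataFrame(
    [
        ("a", 0, None), ("a", 1, "x"), ("a", 2, "y"), ("a", 3, "z"),
        ("a", 4, None), ("b", 1, None), ("b", 2, None),
    ],
    schema=("key", "order", "value"),
)
w = Window.partitionBy("key").orderBy("order")
# Classic PySpark returns "x" for the ("a", 1) row in the ignoreNulls=False
# column; per the traceback above, Spark Connect returns None there instead.
df.select("key", "order", nth_value("value", 2, False).over(w)).show()
{code}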


