Posted to reviews@spark.apache.org by "dzhigimont (via GitHub)" <gi...@apache.org> on 2023/03/14 15:59:06 UTC

[GitHub] [spark] dzhigimont opened a new pull request, #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

dzhigimont opened a new pull request, #40420:
URL: https://github.com/apache/spark/pull/40420

   
   ### What changes were proposed in this pull request?
   Support `isocalendar` from the pandas 2.0.0
   
   ### Why are the changes needed?
   When pandas 2.0.0 is released, pandas API on Spark should match its behavior.
   
   ### Does this PR introduce _any_ user-facing change?
   Added a new method `DatetimeIndex.isocalendar` and removed two deprecated properties, `DatetimeIndex.week` and `DatetimeIndex.weekofyear`.
   ```
   >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
   >>> dfs.dt.isocalendar()
               year  week  day
   2019-12-29  2019    52    7
   2019-12-30  2020     1    1
   2019-12-31  2020     1    2
   2020-01-01  2020     1    3

   >>> dfs.dt.isocalendar().week
   2019-12-29    52
   2019-12-30     1
   2019-12-31     1
   2020-01-01     1
   Name: week, dtype: int64
   ```
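
   For reference, the expected ISO values above can be cross-checked with the Python standard library alone; the snippet below is an illustrative sketch, independent of the Spark/pandas implementation.
   ```python
   # Illustrative cross-check of the expected ISO 8601 values using only the
   # standard library (not part of the PR itself).
   import datetime

   for day in ("2019-12-29", "2019-12-30", "2019-12-31", "2020-01-01"):
       iso_year, iso_week, iso_weekday = datetime.date.fromisoformat(day).isocalendar()
       print(day, iso_year, iso_week, iso_weekday)

   # 2019-12-29 2019 52 7
   # 2019-12-30 2020 1 1
   # 2019-12-31 2020 1 2
   # 2020-01-01 2020 1 3
   ```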
   
   ### How was this patch tested?
   Unit tests were updated.




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1332376664


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   We can merge it before Spark 4.0.0, so we have enough time tho.





[GitHub] [spark] HyukjinKwon commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1136370080


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +118,50 @@ def pandas_microsecond(s) -> ps.Series[np.int64]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
-        """
-        The week ordinal of the year.
-
-        .. deprecated:: 3.4.0
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
+        Calculate year, week, and day according to the ISO 8601 standard.
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.

Review Comment:
   ```suggestion
               With columns year, week and day.
   
   ```



##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +118,50 @@ def pandas_microsecond(s) -> ps.Series[np.int64]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
-        """
-        The week ordinal of the year.
-
-        .. deprecated:: 3.4.0
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
+        Calculate year, week, and day according to the ISO 8601 standard.

Review Comment:
   ```suggestion
           Calculate year, week, and day according to the ISO 8601 standard.
   
   ```





[GitHub] [spark] dzhigimont commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1316944043


##########
python/pyspark/pandas/tests/indexes/test_datetime.py:
##########
@@ -269,6 +256,10 @@ def test_map(self):
         mapper_pser = pd.Series([1, 2, 3], index=pidx)
         self.assert_eq(psidx.map(mapper_pser), pidx.map(mapper_pser))
 
+    def test_isocalendar(self):

Review Comment:
   Moved to the appropriate test



##########
python/pyspark/pandas/tests/indexes/test_datetime.py:
##########
@@ -101,22 +102,8 @@ def test_properties(self):
                 self.assert_eq(psidx.day_of_week, pidx.day_of_week)
 
         if LooseVersion(pd.__version__) >= LooseVersion("2.0.0"):

Review Comment:
   Removed





[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1332390021


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3
+
+        >>> dfs.dt.isocalendar().week
+        2019-12-29    52
+        2019-12-30     1
+        2019-12-31     1
+        2020-01-01     1
+        Name: week, dtype: int64
+        """
+
+        return_types = [self._data.index.dtype, int, int, int]
+
+        def pandas_isocalendar(  # type: ignore[no-untyped-def]
+            pdf,
+        ) -> ps.DataFrame[return_types]:  # type: ignore[valid-type]

Review Comment:
   ```suggestion
           def pandas_isocalendar(
               pdf: pd.DataFrame,
           ) -> ps.DataFrame[Any]: 
   ```





[GitHub] [spark] HyukjinKwon commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1136370343


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +118,50 @@ def pandas_microsecond(s) -> ps.Series[np.int64]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
-        """
-        The week ordinal of the year.
-
-        .. deprecated:: 3.4.0
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
+        Calculate year, week, and day according to the ISO 8601 standard.
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.

Review Comment:
   Otherwise, the HTML rendering of this doc is broken.





[GitHub] [spark] dzhigimont commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1727680820

   > @dzhigimont Can we just make the CI pass for now? I can help in the follow-ups after merging this one.
   > 
   > Seems like the mypy checks is failing for now:
   > 
   > ```
   > starting mypy annotations test...
   > annotations failed mypy checks:
   > python/pyspark/pandas/namespace.py:162: error: Cannot assign multiple types to name "_range" without an explicit "Type[...]" annotation  [misc]
   > python/pyspark/pandas/indexes/base.py:2075: error: Unused "type: ignore" comment
   > python/pyspark/pandas/indexes/base.py:2145: error: Unused "type: ignore" comment
   > Found 3 errors in 2 files (checked 703 source files)
   > ```
   > 
   > To resolve them:
   > 
   > * [ ]  Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2075
   > * [ ]  Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2145
   > * [ ]  Add "# type: ignore" comment to python/pyspark/pandas/namespace.py:162
   
   I can't figure this out: when I added the "# type: ignore" comment to python/pyspark/pandas/namespace.py:162, mypy raised an "unused type: ignore" error, and when I deleted it, mypy raised `python/pyspark/pandas/namespace.py:161: error: Cannot assign multiple types to name "_range" without an explicit "Type[...]" annotation  [misc]`.
   What do you suggest? How can I resolve the problem?
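
   For illustration, here is a minimal, hypothetical sketch (not the actual `namespace.py` code) of the pattern that typically triggers this mypy error, and how an explicit `Type[...]` annotation can avoid it:
   ```python
   # Hypothetical reproduction of mypy's [misc] error; assumes the real code
   # assigns different classes to the same name under some version check.
   from typing import Type

   class _RangeOld: ...
   class _RangeNew: ...

   NEW_PANDAS = True

   _range = _RangeOld
   if NEW_PANDAS:
       _range = _RangeNew  # error: Cannot assign multiple types to name "_range"
                           # without an explicit "Type[...]" annotation  [misc]

   # Declaring the name with an explicit Type[...] annotation satisfies mypy,
   # as an alternative to a targeted "# type: ignore[misc]" comment.
   _range_annotated: Type[object] = _RangeOld
   if NEW_PANDAS:
       _range_annotated = _RangeNew
   ```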
   
   




[GitHub] [spark] dzhigimont commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1727427143

   > @dzhigimont Can we just make the CI pass for now? I can help in the follow-ups after merging this one.
   > 
   > Seems like the mypy checks is failing for now:
   > 
   > ```
   > starting mypy annotations test...
   > annotations failed mypy checks:
   > python/pyspark/pandas/namespace.py:162: error: Cannot assign multiple types to name "_range" without an explicit "Type[...]" annotation  [misc]
   > python/pyspark/pandas/indexes/base.py:2075: error: Unused "type: ignore" comment
   > python/pyspark/pandas/indexes/base.py:2145: error: Unused "type: ignore" comment
   > Found 3 errors in 2 files (checked 703 source files)
   > ```
   > 
   > To resolve them:
   > 
   > * [ ]  Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2075
   > * [ ]  Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2145
   > * [ ]  Add "# type: ignore" comment to python/pyspark/pandas/namespace.py:162
   
   Fixed




[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1528926660

   Can you rebase onto master and try running the linter again?
   If the problem still exists, then yes, let's fix the `mypy` failure. We should make the PR pass the CI anyway.




[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1728575805

   Yeah, the mypy check is always tricky 😂  Let me take a look




[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1692633152

   @dzhigimont Could you proceed with this PR if you're still interested in this work?




[GitHub] [spark] bjornjorgensen commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "bjornjorgensen (via GitHub)" <gi...@apache.org>.
bjornjorgensen commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1575528446

   @dzhigimont we have upgraded the main branch to pandas 2.0.2 now.




[GitHub] [spark] HyukjinKwon commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1469020494

   cc @itholic 




[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1524465392

   Could you resolve the mypy check failure? You can run the static analysis locally by running `dev/lint-python`.




[GitHub] [spark] dzhigimont commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1316957666


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,55 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3
+        >>> dfs.dt.isocalendar().week
+        2019-12-29    52
+        2019-12-30     1
+        2019-12-31     1
+        2020-01-01     1
+        Name: week, dtype: int64
+        """
+
+        return_types = [self._data.index.dtype, int, int, int]
+
+        def pandas_isocalendar(  # type: ignore[no-untyped-def]
+            pdf,
+        ) -> ps.DataFrame[return_types]:  # type: ignore[valid-type]
+            # cast to int64 due to UInt32 is not supported by spark

Review Comment:
   Added a note



##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,55 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3

Review Comment:
   Added





[GitHub] [spark] dzhigimont commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1692958737

   > @dzhigimont Could you proceed this PR if you're still interested on this work?
   
   Sure, sorry for the delay; I was a bit overloaded with work.




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1310665191


##########
python/pyspark/pandas/tests/indexes/test_datetime.py:
##########
@@ -269,6 +256,10 @@ def test_map(self):
         mapper_pser = pd.Series([1, 2, 3], index=pidx)
         self.assert_eq(psidx.map(mapper_pser), pidx.map(mapper_pser))
 
+    def test_isocalendar(self):

Review Comment:
   If we want to add a new test, then I think it's better to also move the related tests from "test_properties" to here.



##########
python/pyspark/pandas/tests/indexes/test_datetime.py:
##########
@@ -101,22 +102,8 @@ def test_properties(self):
                 self.assert_eq(psidx.day_of_week, pidx.day_of_week)
 
         if LooseVersion(pd.__version__) >= LooseVersion("2.0.0"):

Review Comment:
   Let's remove this to focus on testing for the latest pandas version.





[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1310662293


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,55 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3
+        >>> dfs.dt.isocalendar().week
+        2019-12-29    52
+        2019-12-30     1
+        2019-12-31     1
+        2020-01-01     1
+        Name: week, dtype: int64
+        """
+
+        return_types = [self._data.index.dtype, int, int, int]
+
+        def pandas_isocalendar(  # type: ignore[no-untyped-def]
+            pdf,
+        ) -> ps.DataFrame[return_types]:  # type: ignore[valid-type]
+            # cast to int64 due to UInt32 is not supported by spark

Review Comment:
   > cast to int64 due to UInt32 is not supported by spark
    
   Does this mean that the result is different from pandas? If so, let's add a "Note" to the docstring so that users recognize this difference, instead of just adding the comment here.
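
   For reference, a small sketch (assuming pandas >= 1.1) of the pandas-side dtypes behind that comment; pandas' `isocalendar()` returns `UInt32` columns, for which Spark has no unsigned counterpart:
   ```python
   # Sketch of the pandas behaviour the comment refers to: the returned columns
   # are UInt32, a dtype Spark cannot represent, hence the cast to int64.
   import pandas as pd

   ser = pd.Series(pd.date_range(start="2019-12-29", freq="D", periods=4))
   print(ser.dt.isocalendar().dtypes)
   # year    UInt32
   # week    UInt32
   # day     UInt32
   # dtype: object
   ```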





[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1330865687


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   @dzhigimont Could you take a look at this comment when you find some time? We can do it in a separate PR in the future if you don't have enough time for it right now :-)





[GitHub] [spark] HyukjinKwon commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1469854808

   Cc @Yikun too if you find some time to review.




[GitHub] [spark] HyukjinKwon commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1523814214

   cc @itholic @zhengruifeng @xinrong-meng if you find some time to review.




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1332376264


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   Sure! It's not very urgent, so please take your time





[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1732912031

   LGTM now.
   
   Thanks for your consistent work on this!




[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1693074343

   No worries! I pinged you just as a reminder :-) Please take your time.




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1333745403


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,57 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3
+
+        >>> dfs.dt.isocalendar().week
+        2019-12-29    52
+        2019-12-30     1
+        2019-12-31     1
+        2020-01-01     1
+        Name: week, dtype: int64
+        """
+
+        def pandas_isocalendar(
+            pdf: pd.DataFrame,
+        ) -> ps.DataFrame[Any]:

Review Comment:
   Oh, sorry we can't use such typing for Pandas UDF. Let's go back to the previous way.
   ```suggestion
           return_types = [self._data.index.dtype, int, int, int]
   
           def pandas_isocalendar(
               pdf: pd.DataFrame,
           ) -> ps.DataFrame[return_types]:  # type: ignore[valid-type]
   ```





[GitHub] [spark] itholic commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1726767540

   @dzhigimont Can we just make the CI pass for now? I can help in the follow-ups after merging this one.
   
   Seems like the mypy check is failing for now:
   ```
   starting mypy annotations test...
   annotations failed mypy checks:
   python/pyspark/pandas/namespace.py:162: error: Cannot assign multiple types to name "_range" without an explicit "Type[...]" annotation  [misc]
   python/pyspark/pandas/indexes/base.py:2075: error: Unused "type: ignore" comment
   python/pyspark/pandas/indexes/base.py:2145: error: Unused "type: ignore" comment
   Found 3 errors in 2 files (checked 703 source files)
   ```
   
   To resolve them:
   - [ ] Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2075
   - [ ] Remove "type: ignore" comment from python/pyspark/pandas/indexes/base.py:2145
   - [ ] Add "# type: ignore" to python/pyspark/pandas/namespace.py:162




[GitHub] [spark] HyukjinKwon commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1732993368

   Merged to master.




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1319259812


##########
python/pyspark/pandas/indexes/datetimes.py:
##########
@@ -214,28 +215,8 @@ def microsecond(self) -> Index:
         )
         return Index(self.to_series().dt.microsecond)
 
-    @property
-    def week(self) -> Index:
-        """
-        The week ordinal of the year.
-
-        .. deprecated:: 3.5.0
-        """
-        warnings.warn(
-            "`week` is deprecated in 3.5.0 and will be removed in 4.0.0.",
-            FutureWarning,
-        )
-        return Index(self.to_series().dt.week)
-
-    @property
-    def weekofyear(self) -> Index:
-        warnings.warn(
-            "`weekofyear` is deprecated in 3.5.0 and will be removed in 4.0.0.",
-            FutureWarning,
-        )
-        return Index(self.to_series().dt.weekofyear)
-
-    weekofyear.__doc__ = week.__doc__
+    def isocalendar(self) -> DataFrame:

Review Comment:
   Do we need a docstring?



##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   Can we consider adding an example that includes `pd.NaT`? It seems this case is not working currently:
   
   **Pandas**
   ```python
   >>> ser = pd.to_datetime(pd.Series(["2010-01-01", pd.NaT]))
   >>> ser.dt.isocalendar()
      year  week  day
   0  2009    53     5
   1  <NA>  <NA>  <NA>
   ```
   
   **Current implementation**
   ```python
   >>> ser = pd.to_datetime(pd.Series(["2010-01-01", pd.NaT]))
   >>> psser = ps.from_pandas(ser)
   >>> psser.dt.isocalendar()
   # ValueError: cannot convert NA to integer
   ```
   
   In Spark, we can't use mixed types within a single column, so we should convert `NA` to a proper type (e.g. use NaN instead of `NA`, giving a float type in this case) as below:
   
   ```python
   >>> psser.dt.week
   0    53.0
   1     NaN
   dtype: float64
   ```





[GitHub] [spark] dzhigimont commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1527398713

   > Could you resolve mypy check? You can run the static analysis by running `dev/lint-python` locally.
   
   @itholic  I've fixed the mypy issue in my changes, but I still see three problems that aren't related to my changes. Should I fix them?
   
   ```
   annotations failed mypy checks:
   python/pyspark/pandas/namespace.py:162: error: Cannot assign multiple types to name "_range" without an explicit "Type[...]" annotation  [misc]
   python/pyspark/pandas/indexes/base.py:2093: error: unused "type: ignore" comment
   python/pyspark/pandas/indexes/base.py:2163: error: unused "type: ignore" comment
   Found 3 errors in 2 files (checked 507 source files)
   ```
   




[GitHub] [spark] dzhigimont commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1331412813


##########
python/pyspark/pandas/indexes/datetimes.py:
##########
@@ -214,28 +215,8 @@ def microsecond(self) -> Index:
         )
         return Index(self.to_series().dt.microsecond)
 
-    @property
-    def week(self) -> Index:
-        """
-        The week ordinal of the year.
-
-        .. deprecated:: 3.5.0
-        """
-        warnings.warn(
-            "`week` is deprecated in 3.5.0 and will be removed in 4.0.0.",
-            FutureWarning,
-        )
-        return Index(self.to_series().dt.week)
-
-    @property
-    def weekofyear(self) -> Index:
-        warnings.warn(
-            "`weekofyear` is deprecated in 3.5.0 and will be removed in 4.0.0.",
-            FutureWarning,
-        )
-        return Index(self.to_series().dt.weekofyear)
-
-    weekofyear.__doc__ = week.__doc__
+    def isocalendar(self) -> DataFrame:

Review Comment:
   Added





[GitHub] [spark] dzhigimont commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1331413958


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   If you give me 1-2 days, I will try to resolve the problem.



##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,59 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        .. note:: Returns have int64 type instead of UInt32 as is in pandas due to UInt32
+            is not supported by spark
+
+        Examples
+        --------

Review Comment:
   If you give me 1-2 days, I will try to resolve the problem.





[GitHub] [spark] HyukjinKwon closed pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon closed pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0
URL: https://github.com/apache/spark/pull/40420




[GitHub] [spark] dzhigimont commented on pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "dzhigimont (via GitHub)" <gi...@apache.org>.
dzhigimont commented on PR #40420:
URL: https://github.com/apache/spark/pull/40420#issuecomment-1699498241

   Updated the branch




[GitHub] [spark] itholic commented on a diff in pull request #40420: [SPARK-42617][PS] Support `isocalendar` from the pandas 2.0.0

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40420:
URL: https://github.com/apache/spark/pull/40420#discussion_r1310660048


##########
python/pyspark/pandas/datetimes.py:
##########
@@ -116,26 +117,55 @@ def pandas_microsecond(s) -> ps.Series[np.int32]:  # type: ignore[no-untyped-def
     def nanosecond(self) -> "ps.Series":
         raise NotImplementedError()
 
-    # TODO(SPARK-42617): Support isocalendar.week and replace it.
-    # See also https://github.com/pandas-dev/pandas/pull/33595.
-    @property
-    def week(self) -> "ps.Series":
+    def isocalendar(self) -> "ps.DataFrame":
         """
-        The week ordinal of the year.
+        Calculate year, week, and day according to the ISO 8601 standard.
 
-        .. deprecated:: 3.4.0
-        """
-        warnings.warn(
-            "weekofyear and week have been deprecated.",
-            FutureWarning,
-        )
-        return self._data.spark.transform(lambda c: F.weekofyear(c).cast(LongType()))
+            .. versionadded:: 4.0.0
 
-    @property
-    def weekofyear(self) -> "ps.Series":
-        return self.week
+        Returns
+        -------
+        DataFrame
+            With columns year, week and day.
 
-    weekofyear.__doc__ = week.__doc__
+        Examples
+        --------
+        >>> dfs = ps.from_pandas(pd.date_range(start='2019-12-29', freq='D', periods=4).to_series())
+        >>> dfs.dt.isocalendar()
+                    year  week  day
+        2019-12-29  2019    52    7
+        2019-12-30  2020     1    1
+        2019-12-31  2020     1    2
+        2020-01-01  2020     1    3

Review Comment:
   nit: Could you add a blank line between the examples so they are split when displayed in the documentation?


