Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/08/23 00:15:28 UTC

[GitHub] [spark] mhconradt commented on pull request #37616: [SPARK-40178][PYTHON][SQL] Fix partitioning hint parameters in PySpark

mhconradt commented on PR #37616:
URL: https://github.com/apache/spark/pull/37616#issuecomment-1223355455

   Without that additional if-else this code would raise a TypeError. The rationale is that _to_java_column only supports str and Column, so we use it only to convert parameters of those types and don't apply additional conversions to parameters of other types.
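   The guard described above can be sketched without a Spark installation. This is a hypothetical stand-in: the names _to_java_column and convert_hint_parameters here are illustrative, and the string tagging stands in for the real Py4J conversion, but the type check mirrors the pattern under discussion (convert only str/Column, pass everything else through untouched):

   ```python
   # Minimal sketch of the converter pattern; Column and _to_java_column
   # are stand-ins for the real PySpark/Py4J objects.
   class Column:
       def __init__(self, name):
           self.name = name

   def _to_java_column(x):
       # Like the real helper, accepts only str and Column; anything else
       # would raise a TypeError, which is why the caller must guard.
       if isinstance(x, Column):
           return f"jcol:{x.name}"
       if isinstance(x, str):
           return f"jcol:{x}"
       raise TypeError(f"unsupported type: {type(x).__name__}")

   def convert_hint_parameters(parameters):
       # Convert only str/Column; pass other hint parameters
       # (e.g. an int partition count) through unchanged.
       return [_to_java_column(p) if isinstance(p, (Column, str)) else p
               for p in parameters]

   print(convert_hint_parameters(["id", Column("ts"), 8]))
   # ['jcol:id', 'jcol:ts', 8]
   ```

   Without the isinstance guard, passing an int such as a partition count straight into _to_java_column would raise the TypeError mentioned above.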
   
   
   On Mon, Aug 22, 2022 at 08:23, Hyukjin Kwon ***@***.***> wrote:
   
   > @HyukjinKwon commented on this pull request.
   >
   > ---------------------------------------------------------------
   >
   > In [python/pyspark/sql/dataframe.py](https://github.com/apache/spark/pull/37616#discussion_r951436056):
   >
   >> +        jdf = self._jdf.hint(name, self._jseq(parameters,
   > +                                              converter=lambda x: _to_java_expr(x) if isinstance(x, (Column, str)) else x))
   >
   > There's a duplicate if-else in _to_java_expr
   >


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

