Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/12/14 00:20:15 UTC

[GitHub] [spark] maropu commented on a change in pull request #26886: [SPARK-30231][SQL][PYTHON][FOLLOWUP] Make error messages clear in PySpark df.explain

maropu commented on a change in pull request #26886: [SPARK-30231][SQL][PYTHON][FOLLOWUP] Make error messages clear in PySpark df.explain
URL: https://github.com/apache/spark/pull/26886#discussion_r357878859
 
 

 ##########
 File path: python/pyspark/sql/dataframe.py
 ##########
 @@ -305,11 +305,11 @@ def explain(self, extended=None, mode=None):
         is_mode_case = mode is not None and isinstance(mode, basestring)
 
         if not is_no_argument and not (is_extended_case or is_mode_case):
-            argtypes = [
-                str(type(arg)) for arg in [extended, mode] if arg is not None]
-            raise TypeError(
-                "extended (optional) and mode (optional) should be a bool and str; "
-                "however, got [%s]." % ", ".join(argtypes))
+            if extended is not None:
+                errMsg = "extended should be provided as bool, got {0}".format(type(extended))
+            else:  # For mode case
+                errMsg = "mode should be provided as str, got {0}".format(type(mode))
+            raise TypeError(errMsg)
 
 Review comment:
   How about this error handling? @viirya https://github.com/apache/spark/pull/26861#discussion_r357726827
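
  For context, here is a minimal sketch (not part of the PR) of how the proposed messages would surface to a user. It assumes pyspark is installed, a local SparkSession, and an illustrative DataFrame named `df`; it also assumes the validation branch quoted above is what runs for these inputs, so the exact wording and exception type may differ in the merged code.

      # Hedged sketch: `spark` and `df` are illustrative names, not from the PR.
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.master("local[1]").getOrCreate()
      df = spark.range(1)

      df.explain()                  # no argument: prints the physical plan
      df.explain(extended=True)     # bool: prints the extended plan
      df.explain(mode="formatted")  # str: prints the formatted plan

      try:
          df.explain(extended=1)    # int where a bool is expected
      except TypeError as e:
          print(e)  # e.g. "extended should be provided as bool, got <class 'int'>"

      try:
          df.explain(mode=1)        # int where a str is expected
      except TypeError as e:
          print(e)  # e.g. "mode should be provided as str, got <class 'int'>"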

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org