Posted to reviews@spark.apache.org by "itholic (via GitHub)" <gi...@apache.org> on 2023/03/31 07:01:30 UTC

[GitHub] [spark] itholic opened a new pull request, #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

itholic opened a new pull request, #40617:
URL: https://github.com/apache/spark/pull/40617

   ### What changes were proposed in this pull request?
   
   This PR proposes to introduce a new error class, `PySparkRuntimeError`, for PySpark.
   
   
   ### Why are the changes needed?
   
   To cover the built-in `RuntimeError` with the PySpark error framework.
   
   
   ### Does this PR introduce _any_ user-facing change?
   
   No, this is an internal improvement to the error framework.
   
   
   ### How was this patch tested?
   
   The existing CI should pass.
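
   For reference, a minimal sketch of how the new exception could look and be used, assuming it mirrors the existing `PySparkValueError`/`PySparkTypeError` wrappers in `pyspark.errors` (the class shape and the chosen error class below are illustrative, not the exact diff):

```python
from pyspark.errors.exceptions.base import PySparkException


class PySparkRuntimeError(PySparkException, RuntimeError):
    """
    Wrapper class for RuntimeError to support error classes.
    """


# Illustrative usage: raise with an error class and message parameters
# instead of a plain built-in RuntimeError.
raise PySparkRuntimeError(
    error_class="BROADCAST_VARIABLE_NOT_LOADED",
    message_parameters={"variable": "my_broadcast"},
)
```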
   




[GitHub] [spark] zhengruifeng commented on pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #40617:
URL: https://github.com/apache/spark/pull/40617#issuecomment-1522823970

   merged to master




[GitHub] [spark] itholic commented on pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on PR #40617:
URL: https://github.com/apache/spark/pull/40617#issuecomment-1521086780

   Applied the comments, thanks @zhengruifeng!




[GitHub] [spark] itholic commented on a diff in pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40617:
URL: https://github.com/apache/spark/pull/40617#discussion_r1175958890


##########
python/pyspark/errors/error_classes.py:
##########
@@ -79,11 +114,26 @@
       "returnType can not be specified when `<arg_name>` is a user-defined function, but got <return_type>."
     ]
   },
+  "CANNOT_UNPERSIST_BROADCAST": {
+    "message": [
+      "Broadcast can only be unpersisted in driver."
+    ]
+  },
   "COLUMN_IN_LIST": {
     "message": [
       "`<func_name>` does not allow a Column in a list."
     ]
   },
+  "CONTEXT_ONLY_VALID_ON_DRIVER" : {
+    "message" : [
+      "It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063."
+    ]
+  },
+  "CONTEXT_UNAVAILABLE_FOR_REMOTE_CLIENT" : {
+    "message" : [
+      "Remote client cannot create a SparkContext. Create SparkSession instead."

Review Comment:
   Sorry, I didn't catch the point. The message suggests using a SparkSession because it is not possible to create a SparkContext in a remote session. Do you have a suggested alternative wording?
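
   (For context on the remote case: a Spark Connect client is expected to build a `SparkSession` against a remote endpoint rather than create a `SparkContext`. A minimal sketch, with the connection string as a placeholder:)

```python
from pyspark.sql import SparkSession

# Spark Connect clients have no SparkContext; they build a SparkSession
# pointed at a remote endpoint instead (the URL below is a placeholder).
spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()
```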





[GitHub] [spark] itholic commented on a diff in pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "itholic (via GitHub)" <gi...@apache.org>.
itholic commented on code in PR #40617:
URL: https://github.com/apache/spark/pull/40617#discussion_r1175953882


##########
python/pyspark/errors/error_classes.py:
##########
@@ -79,11 +114,26 @@
       "returnType can not be specified when `<arg_name>` is a user-defined function, but got <return_type>."
     ]
   },
+  "CANNOT_UNPERSIST_BROADCAST": {

Review Comment:
   Sounds good. Seems like we can consolidate `CANNOT_DESTROY_BROADCAST` as well :-)





[GitHub] [spark] zhengruifeng commented on a diff in pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #40617:
URL: https://github.com/apache/spark/pull/40617#discussion_r1175926004


##########
python/pyspark/errors/error_classes.py:
##########
@@ -79,11 +114,26 @@
       "returnType can not be specified when `<arg_name>` is a user-defined function, but got <return_type>."
     ]
   },
+  "CANNOT_UNPERSIST_BROADCAST": {

Review Comment:
   Is it possible to combine `CANNOT_UNPERSIST_BROADCAST`, `CANNOT_REDUCE_BROADCAST`, and `CANNOT_DESTROY_BROADCAST` into a single `INVALID_BROADCAST_OPERATION` with a parameter `operation`?
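
   A sketch of what such a consolidated entry and its call site could look like (the dict name and wording here are illustrative, not taken from the diff):

```python
# Illustrative consolidation: one error class parameterized by the operation.
ERROR_CLASSES = {
    "INVALID_BROADCAST_OPERATION": {
        "message": [
            "Broadcast can only be <operation> in driver."
        ]
    },
}

# Callers would then pass the operation as a message parameter, e.g.:
#   raise PySparkRuntimeError(
#       error_class="INVALID_BROADCAST_OPERATION",
#       message_parameters={"operation": "unpersisted"},
#   )
```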



##########
python/pyspark/errors/error_classes.py:
##########
@@ -29,6 +34,21 @@
       "Attribute `<attr_name>` in provided object `<obj_name>` is not callable."
     ]
   },
+  "BARRIER_TASK_CONTEXT_NOT_INITIALIZE": {
+    "message": [
+      "Not supported to call `<func_name>` before initialize BarrierTaskContext."
+    ]
+  },
+  "BROADCAST_VARIABLE_NOT_LOADED": {
+    "message": [
+      "Broadcast variable `<variable>` not loaded."
+    ]
+  },
+  "CALL_BEFORE_SESSION_INITIALIZE" : {

Review Comment:
   What about combining `BARRIER_TASK_CONTEXT_NOT_INITIALIZE` and `CALL_BEFORE_SESSION_INITIALIZE` into a general `CALL_BEFORE_INITIALIZE` with an extra parameter `object`?
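
   A sketch of the combined entry, with the object name supplied as a parameter (illustrative only, not taken from the diff):

```python
# Illustrative consolidation: one error class parameterized by the object
# that must be initialized before the call.
ERROR_CLASSES = {
    "CALL_BEFORE_INITIALIZE": {
        "message": [
            "Not supported to call `<func_name>` before initializing `<object>`."
        ]
    },
}

# e.g. message_parameters={"func_name": "allGather", "object": "BarrierTaskContext"}
```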



##########
python/pyspark/errors/error_classes.py:
##########
@@ -79,11 +114,26 @@
       "returnType can not be specified when `<arg_name>` is a user-defined function, but got <return_type>."
     ]
   },
+  "CANNOT_UNPERSIST_BROADCAST": {
+    "message": [
+      "Broadcast can only be unpersisted in driver."
+    ]
+  },
   "COLUMN_IN_LIST": {
     "message": [
       "`<func_name>` does not allow a Column in a list."
     ]
   },
+  "CONTEXT_ONLY_VALID_ON_DRIVER" : {
+    "message" : [
+      "It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063."
+    ]
+  },
+  "CONTEXT_UNAVAILABLE_FOR_REMOTE_CLIENT" : {
+    "message" : [
+      "Remote client cannot create a SparkContext. Create SparkSession instead."

Review Comment:
   I think the message is misleading; it implies a `SparkSession` was created.





[GitHub] [spark] zhengruifeng commented on a diff in pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on code in PR #40617:
URL: https://github.com/apache/spark/pull/40617#discussion_r1175974399


##########
python/pyspark/errors/error_classes.py:
##########
@@ -79,11 +114,26 @@
       "returnType can not be specified when `<arg_name>` is a user-defined function, but got <return_type>."
     ]
   },
+  "CANNOT_UNPERSIST_BROADCAST": {
+    "message": [
+      "Broadcast can only be unpersisted in driver."
+    ]
+  },
   "COLUMN_IN_LIST": {
     "message": [
       "`<func_name>` does not allow a Column in a list."
     ]
   },
+  "CONTEXT_ONLY_VALID_ON_DRIVER" : {
+    "message" : [
+      "It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063."
+    ]
+  },
+  "CONTEXT_UNAVAILABLE_FOR_REMOTE_CLIENT" : {
+    "message" : [
+      "Remote client cannot create a SparkContext. Create SparkSession instead."

Review Comment:
   NVM. I feel it reads as `Remote client cannot create a SparkContext, so the Spark Connect Python client creates a session instead`, while it actually raises an error and fails.
   Not a big deal.





[GitHub] [spark] zhengruifeng closed pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng closed pull request #40617: [SPARK-42992][PYTHON] Introduce PySparkRuntimeError
URL: https://github.com/apache/spark/pull/40617


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org