Posted to reviews@spark.apache.org by "LuciferYang (via GitHub)" <gi...@apache.org> on 2024/01/23 11:23:56 UTC

Re: [PR] [SPARK-46795][SQL] Replace `UnsupportedOperationException` by `SparkUnsupportedOperationException` in `sql/core` [spark]

LuciferYang commented on code in PR #44772:
URL: https://github.com/apache/spark/pull/44772#discussion_r1463126058


##########
sql/core/src/main/java/org/apache/spark/sql/execution/datasources/orc/OrcArrayColumnVector.java:
##########
@@ -52,61 +53,61 @@ public ColumnarArray getArray(int rowId) {
 
   @Override
   public boolean getBoolean(int rowId) {
-    throw new UnsupportedOperationException();
+    throw SparkUnsupportedOperationException.apply();

Review Comment:
   There might be some omissions, such as:
   
   https://github.com/apache/spark/blob/bc889c8c2adb976bdf3e7f020e3c7c3de5339c54/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/orc/OrcAtomicColumnVector.java#L74
   
   There are also some uses of the `UnsupportedOperationException` constructor with non-null arguments in the current code, for example:
   
   https://github.com/apache/spark/blob/bc889c8c2adb976bdf3e7f020e3c7c3de5339c54/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedRleValuesReader.java#L787-L795
   
   https://github.com/apache/spark/blob/bc889c8c2adb976bdf3e7f020e3c7c3de5339c54/sql/core/src/main/java/org/apache/spark/sql/execution/vectorized/ColumnVectorUtils.java#L185
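   
   The parameterless `SparkUnsupportedOperationException.apply()` used in this PR carries no message, so converting these call sites the same way would drop their diagnostic text. Below is a minimal sketch of what such a migration could look like instead; the class `MigrationSketch`, the error class name `HYPOTHETICAL_ERROR_CLASS`, and the Java-callable `(errorClass, messageParameters)` constructor are illustrative assumptions, not the verified Spark API:
   
   ```java
   import java.util.Map;
   
   import org.apache.spark.SparkUnsupportedOperationException;
   
   final class MigrationSketch {
   
     // Before: a simplified stand-in for the linked call sites, where the
     // diagnostic detail is baked into a free-form message string.
     static RuntimeException before(String typeName) {
       return new UnsupportedOperationException("Unsupported type: " + typeName);
     }
   
     // After (sketch): the same detail carried as error-class message parameters.
     // "HYPOTHETICAL_ERROR_CLASS" and the (String, java.util.Map) constructor are
     // assumptions here; the real constructors on SparkUnsupportedOperationException
     // should be checked before applying this pattern.
     static RuntimeException after(String typeName) {
       return new SparkUnsupportedOperationException(
         "HYPOTHETICAL_ERROR_CLASS",
         Map.of("dataType", typeName));
     }
   }
   ```
   
   If no such constructor is callable from Java, these call sites may need a dedicated error class or a small Scala-side helper rather than the bare `apply()`.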
   
   Are these call sites outside the scope of the current PR?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org