Posted to reviews@spark.apache.org by "bersprockets (via GitHub)" <gi...@apache.org> on 2023/02/15 02:34:03 UTC

[GitHub] [spark] bersprockets opened a new pull request, #40026: [SPARK-42401][SQL][FOLLOWUP] Always set `containsNull=true` for `array_insert`

bersprockets opened a new pull request, #40026:
URL: https://github.com/apache/spark/pull/40026

   ### What changes were proposed in this pull request?
   
   Always set `containsNull=true` in the data type returned by `ArrayInsert#dataType`.
   
   ### Why are the changes needed?
   
   PR #39970 fixed an issue where the data type for `array_insert` did not always have `containsNull=true` when the user explicitly inserted a nullable value into the array. However, that fix does not handle the case where `array_insert` implicitly inserts null values into the array (e.g., when the insertion position is out of range):
   ```
   spark-sql> select array_insert(array('1', '2', '3', '4'), -6, '5');
   23/02/14 16:10:19 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
   java.lang.NullPointerException
   	at org.apache.spark.sql.catalyst.expressions.codegen.UnsafeWriter.write(UnsafeWriter.java:110)
   	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.project_doConsume_0$(Unknown Source)
   	at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
   	at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
   ```
   Because we can't know at planning time whether the insertion position will be out of range, we should always set `containsNull=true` on the data type for `array_insert`.
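   To see why an out-of-range position forces `containsNull=true`, here is a minimal Python model of the padding behavior (a sketch only, not Spark's actual implementation; the 1-based, negative-from-the-end index convention shown here is an assumption made for illustration):
   ```python
   def array_insert(arr, pos, item):
       """Sketch of array_insert padding semantics (not Spark's code).

       `pos` is 1-based; negative positions count from the end of the
       array. When `pos` falls outside the array, the gap is filled with
       None (SQL NULL) -- which is why the result's element type must
       have containsNull=true even when the inserted value is non-null.
       """
       if pos == 0:
           raise ValueError("insertion position must not be zero")
       if pos > 0:
           idx = pos - 1
           if idx > len(arr):
               # Out of range on the right: pad the tail with nulls.
               arr = arr + [None] * (idx - len(arr))
           return arr[:idx] + [item] + arr[idx:]
       idx = len(arr) + pos + 1
       if idx < 0:
           # Out of range on the left: nulls fill the gap after the item.
           return [item] + [None] * (-idx) + arr

       return arr[:idx] + [item] + arr[idx:]

   # The query from the description: position -6 into a 4-element array
   # leaves a one-element gap, so a null appears in the result.
   print(array_insert(['1', '2', '3', '4'], -6, '5'))
   # -> ['5', None, '1', '2', '3', '4']
   ```
   Because the inserted value `'5'` is non-null, a `containsNull` computed only from the inputs would be `false`, yet the result still contains a null; hence the data type must unconditionally report `containsNull=true`.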
   
   ### Does this PR introduce _any_ user-facing change?
   
   No.
   
   ### How was this patch tested?
   
   New unit tests.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] HyukjinKwon closed pull request #40026: [SPARK-42401][SQL][FOLLOWUP] Always set `containsNull=true` for `array_insert`

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon closed pull request #40026: [SPARK-42401][SQL][FOLLOWUP] Always set `containsNull=true` for `array_insert`
URL: https://github.com/apache/spark/pull/40026




[GitHub] [spark] HyukjinKwon commented on pull request #40026: [SPARK-42401][SQL][FOLLOWUP] Always set `containsNull=true` for `array_insert`

Posted by "HyukjinKwon (via GitHub)" <gi...@apache.org>.
HyukjinKwon commented on PR #40026:
URL: https://github.com/apache/spark/pull/40026#issuecomment-1431231926

   Merged to master and branch-3.4.

