Posted to issues@spark.apache.org by "Herman van Hovell (JIRA)" <ji...@apache.org> on 2016/11/21 13:51:59 UTC

[jira] [Resolved] (SPARK-18398) Fix nullabilities of MapObjects and optimize not to check null if lambda is not nullable.

     [ https://issues.apache.org/jira/browse/SPARK-18398?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Herman van Hovell resolved SPARK-18398.
---------------------------------------
       Resolution: Fixed
         Assignee: Takuya Ueshin
    Fix Version/s: 2.1.0

> Fix nullabilities of MapObjects and optimize not to check null if lambda is not nullable.
> -----------------------------------------------------------------------------------------
>
>                 Key: SPARK-18398
>                 URL: https://issues.apache.org/jira/browse/SPARK-18398
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Takuya Ueshin
>            Assignee: Takuya Ueshin
>             Fix For: 2.1.0
>
>
> The nullabilities of {{MapObjects}} can be made stricter by relying on {{inputObject.nullable}} and {{lambdaFunction.nullable}}. We can also optimize the generated code slightly by skipping the extra per-element null check when the lambda is not nullable (a simplified sketch of the idea follows the examples below).
> Example of the generated code before the change:
> {code}
> boolean isNull4 = i.isNullAt(0);
> ArrayData value4 = isNull4 ? null : (i.getArray(0));
> ArrayData value3 = null;
> if (!isNull4) {
>     Integer[] convertedArray = null;
>     int dataLength = value4.numElements();
>     convertedArray = new Integer[dataLength];
>     int loopIndex = 0;
>     while (loopIndex < dataLength) {
>         MapObjects_loopValue108 = (int) (value4.getInt(loopIndex));
>         MapObjects_loopIsNull109 = value4.isNullAt(loopIndex);
>         if (MapObjects_loopIsNull109) {
>             throw new RuntimeException(((java.lang.String) references[0]));
>         }
>         if (false) { // the lambda is not nullable, so this null branch is dead code
>             convertedArray[loopIndex] = null;
>         } else {
>             convertedArray[loopIndex] = MapObjects_loopValue108;
>         }
>         loopIndex += 1;
>     }
>     value3 = new org.apache.spark.sql.catalyst.util.GenericArrayData(convertedArray);
> }
> {code}
> after:
> {code}
> boolean isNull4 = i.isNullAt(0);
> ArrayData value4 = isNull4 ? null : (i.getArray(0));
> ArrayData value3 = null;
> if (!isNull4) {
>     Integer[] convertedArray = null;
>     int dataLength = value4.numElements();
>     convertedArray = new Integer[dataLength];
>     int loopIndex = 0;
>     while (loopIndex < dataLength) {
>         MapObjects_loopValue108 = (int) (value4.getInt(loopIndex));
>         MapObjects_loopIsNull109 = value4.isNullAt(loopIndex);
>         if (MapObjects_loopIsNull109) {
>             throw new RuntimeException(((java.lang.String) references[0]));
>         }
>         convertedArray[loopIndex] = MapObjects_loopValue108; // the always-false null branch is no longer generated
>         loopIndex += 1;
>     }
>     value3 = new org.apache.spark.sql.catalyst.util.GenericArrayData(convertedArray);
> }
> {code}
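> A minimal, self-contained sketch of the underlying idea (the class and method names below are illustrative stand-ins, not the actual Catalyst definitions): the expression's nullability follows its input, and the per-element null branch is only emitted when the lambda itself can produce null.
> {code}
> // Hypothetical, simplified model of a MapObjects-like expression, for illustration only.
> case class Expr(nullable: Boolean)
>
> case class MapObjectsSketch(inputObject: Expr, lambdaFunction: Expr) {
>   // Previously this was effectively always true; tightened to follow the input's nullability.
>   def nullable: Boolean = inputObject.nullable
>
>   // Codegen sketch: emit the "null element" branch only when the lambda can
>   // actually return null; otherwise assign the converted value unconditionally.
>   def genElementAssignment(isNullVar: String, valueVar: String, target: String): String =
>     if (lambdaFunction.nullable) {
>       s"""if ($isNullVar) {
>          |  $target = null;
>          |} else {
>          |  $target = $valueVar;
>          |}""".stripMargin
>     } else {
>       s"$target = $valueVar;"
>     }
> }
>
> object Demo extends App {
>   val m = MapObjectsSketch(inputObject = Expr(nullable = true), lambdaFunction = Expr(nullable = false))
>   println(m.nullable)  // true: follows the input's nullability
>   println(m.genElementAssignment("loopIsNull", "loopValue", "convertedArray[i]"))
>   // prints: convertedArray[i] = loopValue;  (no dead "if (false)" branch)
> }
> {code}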



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org