Posted to issues@spark.apache.org by "Takuya Ueshin (JIRA)" <ji...@apache.org> on 2016/11/10 09:41:58 UTC

[jira] [Created] (SPARK-18398) Fix nullabilities of MapObjects and optimize not to check null if lambda is not nullable.

Takuya Ueshin created SPARK-18398:
-------------------------------------

             Summary: Fix nullabilities of MapObjects and optimize not to check null if lambda is not nullable.
                 Key: SPARK-18398
                 URL: https://issues.apache.org/jira/browse/SPARK-18398
             Project: Spark
          Issue Type: Improvement
          Components: SQL
            Reporter: Takuya Ueshin


The nullability of {{MapObjects}} can be made stricter by deriving it from {{inputObject.nullable}} and {{lambdaFunction.nullable}}.

We can also optimize its execution slightly by skipping the extra null check when the lambda is not nullable.

The example of generated code before:

{code}
boolean isNull4 = i.isNullAt(0);
ArrayData value4 = isNull4 ? null : (i.getArray(0));
ArrayData value3 = null;

if (!isNull4) {
    Integer[] convertedArray = null;
    int dataLength = value4.numElements();
    convertedArray = new Integer[dataLength];

    int loopIndex = 0;
    while (loopIndex < dataLength) {
        MapObjects_loopValue108 = (int) (value4.getInt(loopIndex));
        MapObjects_loopIsNull109 = value4.isNullAt(loopIndex);

        if (MapObjects_loopIsNull109) {
            throw new RuntimeException(((java.lang.String) references[0]));
        }

        if (false) {
            convertedArray[loopIndex] = null;
        } else {
            convertedArray[loopIndex] = MapObjects_loopValue108;
        }

        loopIndex += 1;
    }

    value3 = new org.apache.spark.sql.catalyst.util.GenericArrayData(convertedArray);
}
{code}

after:

{code}
boolean isNull4 = i.isNullAt(0);
ArrayData value4 = isNull4 ? null : (i.getArray(0));
ArrayData value3 = null;

if (!isNull4) {
    Integer[] convertedArray = null;
    int dataLength = value4.numElements();
    convertedArray = new Integer[dataLength];

    int loopIndex = 0;
    while (loopIndex < dataLength) {
        MapObjects_loopValue108 = (int) (value4.getInt(loopIndex));
        MapObjects_loopIsNull109 = value4.isNullAt(loopIndex);

        if (MapObjects_loopIsNull109) {
            throw new RuntimeException(((java.lang.String) references[0]));
        }
        convertedArray[loopIndex] = MapObjects_loopValue108;

        loopIndex += 1;
    }

    value3 = new org.apache.spark.sql.catalyst.util.GenericArrayData(convertedArray);
}
{code}
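The difference between the two snippets comes down to one decision in the code generator: when the lambda result is statically known to be non-nullable, the {{if (false) ... else ...}} dead branch can be dropped and the element assigned directly. As a hedged sketch (this is not Catalyst's actual codegen API; the method and class names here are hypothetical, for illustration only), the decision could look like:

```java
// Hypothetical sketch of the codegen decision in SPARK-18398 (not Spark's
// real MapObjects implementation): emit the per-element null branch only
// when the lambda's result may actually be null.
public class MapObjectsCodegenSketch {

    /** Emit the element-assignment snippet for the generated loop body. */
    static String elementAssignment(boolean lambdaNullable) {
        if (lambdaNullable) {
            // Nullable lambda: the runtime null check is still required.
            return "if (loopIsNull) { convertedArray[loopIndex] = null; } "
                 + "else { convertedArray[loopIndex] = loopValue; }";
        } else {
            // Non-nullable lambda: assign directly, no dead branch.
            return "convertedArray[loopIndex] = loopValue;";
        }
    }

    public static void main(String[] args) {
        System.out.println("nullable lambda:     " + elementAssignment(true));
        System.out.println("non-nullable lambda: " + elementAssignment(false));
    }
}
```

With a non-nullable lambda this produces exactly the single-line assignment seen in the "after" snippet, instead of the unreachable {{if (false)}} branch in the "before" snippet.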

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
