Posted to issues@calcite.apache.org by "jackylau (Jira)" <ji...@apache.org> on 2023/05/30 03:40:00 UTC

[jira] [Comment Edited] (CALCITE-5704) Add ARRAY_EXCEPT, ARRAY_INTERSECT and ARRAY_UNION for Spark dialect

    [ https://issues.apache.org/jira/browse/CALCITE-5704?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17727311#comment-17727311 ] 

jackylau edited comment on CALCITE-5704 at 5/30/23 3:39 AM:
------------------------------------------------------------

Hi [~jiajunbernoulli], I responded to it in the PR.

I fixed it by referring to CONCAT, so the check happens in the return type:
{code:java}
public static final SqlBinaryOperator CONCAT =
    new SqlBinaryOperator(
        "||",
        SqlKind.OTHER,
        60,
        true,
        ReturnTypes.ARG0.andThen((opBinding, typeToTransform) -> {
          SqlReturnTypeInference returnType =
              typeToTransform.getSqlTypeName().getFamily() == SqlTypeFamily.ARRAY
                  ? ReturnTypes.LEAST_RESTRICTIVE
                  : ReturnTypes.DYADIC_STRING_SUM_PRECISION_NULLABLE;
          return requireNonNull(returnType.inferReturnType(opBinding),
              "inferred CONCAT element type");
        }),
        null,
        OperandTypes.STRING_SAME_SAME_OR_ARRAY_SAME_SAME);{code}
For example, ARRAY_EXCEPT(array[1], array['a']) will also throw an exception. The only difference is where the exception is thrown: one is in the return-type inference, the other is in the operand type checker.
{code:java}
java.lang.IllegalArgumentException: Cannot infer return type for ARRAY_EXCEPT; operand types: [INTEGER ARRAY, CHAR(1) ARRAY] {code}
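The trade-off between the two failure points can be shown with a minimal, Calcite-free Java sketch (the `TypeFamily` enum and helper names below are hypothetical, for illustration only): an operand type checker rejects mismatched operands up front, while return-type inference only fails once it tries to find a least-restrictive common type.

```java
/** Hypothetical sketch: the same mismatch fails in two different places. */
public class TypeCheckSketch {
  enum TypeFamily { INTEGER, CHARACTER }

  /** Operand-type-checker style: reject mismatched element families up front. */
  static void checkOperands(TypeFamily a, TypeFamily b) {
    if (a != b) {
      throw new IllegalArgumentException(
          "Parameters must be of the same type; got " + a + " and " + b);
    }
  }

  /** Return-type-inference style: only fail when no common type exists. */
  static TypeFamily inferLeastRestrictive(TypeFamily a, TypeFamily b) {
    if (a == b) {
      return a;
    }
    throw new IllegalArgumentException(
        "Cannot infer return type; operand types: ["
            + a + " ARRAY, " + b + " ARRAY]");
  }

  public static void main(String[] args) {
    // Both strategies reject ARRAY_EXCEPT(array[1], array['a']);
    // only the place where the exception is raised differs.
    try {
      checkOperands(TypeFamily.INTEGER, TypeFamily.CHARACTER);
    } catch (IllegalArgumentException e) {
      System.out.println("operand checker: " + e.getMessage());
    }
    try {
      inferLeastRestrictive(TypeFamily.INTEGER, TypeFamily.CHARACTER);
    } catch (IllegalArgumentException e) {
      System.out.println("return inference: " + e.getMessage());
    }
    System.out.println("common type: "
        + inferLeastRestrictive(TypeFamily.INTEGER, TypeFamily.INTEGER));
  }
}
```

Either way the mismatched call is rejected; the choice only affects which validation phase produces the error message.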
And some other functions already do it this way, such as:
{code:java}
  /** The "ARRAY(exp, ...)" function (Spark);
   * compare with the standard array value constructor, "ARRAY [exp, ...]". */
  @LibraryOperator(libraries = {SPARK})
  public static final SqlFunction ARRAY =
      SqlBasicFunction.create("ARRAY",
          SqlLibraryOperators::arrayReturnType,
          OperandTypes.SAME_VARIADIC); {code}



> Add ARRAY_EXCEPT, ARRAY_INTERSECT and ARRAY_UNION for Spark dialect
> -------------------------------------------------------------------
>
>                 Key: CALCITE-5704
>                 URL: https://issues.apache.org/jira/browse/CALCITE-5704
>             Project: Calcite
>          Issue Type: Improvement
>          Components: core
>    Affects Versions: 1.35.0
>            Reporter: jackylau
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.35.0
>
>
> array_union(array1, array2) - Returns an array of the elements in the union of array1 and array2, without duplicates.
> array_intersect(array1, array2) - Returns an array of the elements in the intersection of array1 and array2, without duplicates.
> array_except(array1, array2) - Returns an array of the elements in array1 but not in array2, without duplicates.
> For more details, see
> [https://spark.apache.org/docs/latest/api/sql/index.html]



--
This message was sent by Atlassian Jira
(v8.20.10#820010)