Posted to issues@flink.apache.org by wuchong <gi...@git.apache.org> on 2017/02/17 08:20:38 UTC

[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

GitHub user wuchong opened a pull request:

    https://github.com/apache/flink/pull/3338

    [FLINK-5414] [table] Bump up Calcite version to 1.11

    Thanks for contributing to Apache Flink. Before you open your pull request, please take the following check list into consideration.
    If your changes take all of the items into account, feel free to open your pull request. For more information and/or questions please refer to the [How To Contribute guide](http://flink.apache.org/how-to-contribute.html).
    In addition to going through the list, please provide a meaningful description of your changes.
    
    - [ ] General
      - The pull request references the related JIRA issue ("[FLINK-XXX] Jira title text")
      - The pull request addresses only one issue
      - Each commit in the PR has a meaningful commit message (including the JIRA id)
    
    - [ ] Documentation
      - Documentation has been added for new functionality
      - Old documentation affected by the pull request has been updated
      - JavaDoc for public methods has been added
    
    - [ ] Tests & Build
      - Functionality added by the pull request is covered by tests
      - `mvn clean verify` has been executed successfully locally or a Travis build has passed
    
    
    This PR upgrades Calcite to version 1.11. There were a lot of compatibility issues, which I have fixed. Please correct me if any of the fixes are wrong.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/wuchong/flink calcite-FLINK-5414

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/3338.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #3338
    
----
commit d3d4c914325c1591c1bd5b0e5081df020cda0507
Author: Jark Wu <wu...@alibaba-inc.com>
Date:   2017-02-17T05:17:43Z

    [FLINK-5414] [table] Bump up Calcite version to 1.11

----



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by fhueske <gi...@git.apache.org>.
Github user fhueske commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r101728056
  
    --- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala ---
    @@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
             case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
               reducedValues.add(unreduced)
             case _ =>
    +          val reducedValue = reduced.getField(reducedIdx)
    +          // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
    --- End diff --
    
    Can you check the comment? Shouldn't "boolean" be "double"?



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by haohui <gi...@git.apache.org>.
Github user haohui commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r102117236
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
    @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
           "DataSetCalc",
           batchTableNode(0),
           term("select",
    -        "13 AS _c0",
    +        "CAST(13) AS _c0",
    --- End diff --
    
    Is it possible to adopt Calcite 1.11 without changing the default nullability? Let me try it out as well.



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r101977305
  
    --- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/plan/ProjectionTranslator.scala ---
    @@ -108,62 +109,73 @@ object ProjectionTranslator {
           tableEnv: TableEnvironment,
           aggNames: Map[Expression, String],
           propNames: Map[Expression, String]): Seq[NamedExpression] = {
    -    exprs.map(replaceAggregationsAndProperties(_, tableEnv, aggNames, propNames))
    -        .map(UnresolvedAlias)
    -  }
     
    -  private def replaceAggregationsAndProperties(
    +    val projectNames: mutable.HashSet[String] = new mutable.HashSet[String]
    +
    +    def replaceAggregationsAndProperties(
    --- End diff --
    
    Can you rename this or the outer function? Having two functions with the same name is confusing.



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by wuchong <gi...@git.apache.org>.
Github user wuchong commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r102040448
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
    @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
           "DataSetCalc",
           batchTableNode(0),
           term("select",
    -        "13 AS _c0",
    +        "CAST(13) AS _c0",
    --- End diff --
    
    Yes. I changed the default nullability to `true`, but the reduced constant is `NOT NULL`, so a `CAST` back to the nullable type is inserted here. Do you have any ideas how to fix this?

    I changed the default nullability to `true` because `UserDefinedScalarFunctionTest.testResults` and `ArrayTypeTest.testArrayLiterals` fail otherwise.
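
    To make the cause concrete, here is a minimal, hypothetical sketch using plain Calcite type-factory calls (the object name and the use of `JavaTypeFactoryImpl` instead of Flink's `FlinkTypeFactory` are illustrative, not the code in this PR): the reduced literal carries a `NOT NULL` type while the original expression's type is nullable, so the two types no longer match and the plan keeps an explicit `CAST`, which is where the `CAST(13) AS _c0` in the test output comes from.

    ```scala
    import org.apache.calcite.jdbc.JavaTypeFactoryImpl
    import org.apache.calcite.sql.`type`.SqlTypeName

    object NullabilityMismatchSketch {
      def main(args: Array[String]): Unit = {
        val typeFactory = new JavaTypeFactoryImpl()

        // Type of the reduced constant: literals built by RexBuilder are NOT NULL.
        val literalType = typeFactory.createSqlType(SqlTypeName.INTEGER)

        // Type of the original expression once the default nullability is `true`.
        val exprType = typeFactory.createTypeWithNullability(literalType, true)

        println(literalType.isNullable)  // false
        println(exprType.isNullable)     // true
        // The types differ (only) in nullability, so replacing the expression with
        // the bare literal would change its type; a CAST is kept in the plan instead.
        println(literalType == exprType) // false
      }
    }
    ```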



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r101977849
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
    @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
           "DataSetCalc",
           batchTableNode(0),
           term("select",
    -        "13 AS _c0",
    +        "CAST(13) AS _c0",
    --- End diff --
    
    Do you know why there are so many unnecessary casts? Is it because of the different nullability?



[GitHub] flink issue #3338: [FLINK-5414] [table] Bump up Calcite version to 1.11

Posted by haohui <gi...@git.apache.org>.
Github user haohui commented on the issue:

    https://github.com/apache/flink/pull/3338
  
    Looks good to me overall. One question: does this mean that all array types become nullable after this change?



[GitHub] flink issue #3338: [FLINK-5414] [table] Bump up Calcite version to 1.11

Posted by fhueske <gi...@git.apache.org>.
Github user fhueske commented on the issue:

    https://github.com/apache/flink/pull/3338
  
    @twalthr, you might want to have a look at the `FlinkTypeFactory` changes



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/flink/pull/3338



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by twalthr <gi...@git.apache.org>.
Github user twalthr commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r102046284
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
    @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
           "DataSetCalc",
           batchTableNode(0),
           term("select",
    -        "13 AS _c0",
    +        "CAST(13) AS _c0",
    --- End diff --
    
    I will have a look at it again. In general, the only real solution is to finally fix FLINK-5177.



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by wuchong <gi...@git.apache.org>.
Github user wuchong commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r101750708
  
    --- Diff: flink-libraries/flink-table/src/main/scala/org/apache/flink/table/codegen/ExpressionReducer.scala ---
    @@ -106,8 +106,16 @@ class ExpressionReducer(config: TableConfig)
             case SqlTypeName.ANY | SqlTypeName.ROW | SqlTypeName.ARRAY =>
               reducedValues.add(unreduced)
             case _ =>
    +          val reducedValue = reduced.getField(reducedIdx)
    +          // RexBuilder handle boolean literal incorrectly, convert it into BigDecimal manually
    --- End diff --
    
    Yes, it should be `double`
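
    For context, here is a minimal, hypothetical sketch of the conversion that the (corrected) comment describes; the helper name and the handled cases are illustrative, not the actual `ExpressionReducer` code. Calcite's `RexLiteral` stores numeric values as `java.math.BigDecimal`, so a constant that was reduced to a Java `double` has to be wrapped manually before it is handed back to the `RexBuilder`:

    ```scala
    import java.math.BigDecimal

    // Hypothetical helper: wrap reduced floating-point constants so that the value
    // type matches what Calcite expects for numeric literals.
    def toRexLiteralValue(reducedValue: AnyRef): AnyRef = reducedValue match {
      case d: java.lang.Double => BigDecimal.valueOf(d.doubleValue()) // DOUBLE -> BigDecimal
      case f: java.lang.Float  => BigDecimal.valueOf(f.doubleValue()) // FLOAT  -> BigDecimal
      case other               => other                               // pass everything else through
    }
    ```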



[GitHub] flink pull request #3338: [FLINK-5414] [table] Bump up Calcite version to 1....

Posted by haohui <gi...@git.apache.org>.
Github user haohui commented on a diff in the pull request:

    https://github.com/apache/flink/pull/3338#discussion_r102125031
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/table/ExpressionReductionTest.scala ---
    @@ -155,15 +155,15 @@ class ExpressionReductionTest extends TableTestBase {
           "DataSetCalc",
           batchTableNode(0),
           term("select",
    -        "13 AS _c0",
    +        "CAST(13) AS _c0",
    --- End diff --
    
    I just played around a little bit. I think the problem is that the advanced types are not properly canonized. With the following diff, all tests in `ArrayTypeTest` pass:
    
    ```
    --- a/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
    +++ b/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
    @@ -133,12 +133,18 @@ class FlinkTypeFactory(typeSystem: RelDataTypeSystem) extends JavaTypeFactoryImp
       override def createTypeWithNullability(
         relDataType: RelDataType,
         nullable: Boolean)
    -  : RelDataType = relDataType match {
    -    case composite: CompositeRelDataType =>
    -      // at the moment we do not care about nullability
    -      composite
    -    case _ =>
    -      super.createTypeWithNullability(relDataType, nullable)
    +  : RelDataType = {
    +    val t = relDataType match {
    +      case composite: CompositeRelDataType =>
    +        // at the moment we do not care about nullability
    +        composite
    +      case array: ArrayRelDataType =>
    +        val elementType = canonize(createTypeWithNullability(array.getComponentType, nullable))
    +        new ArrayRelDataType(array.typeInfo, elementType, nullable)
    +      case _ =>
    +        super.createTypeWithNullability(relDataType, nullable)
    +    }
    +    canonize(t)
       }
     }
    ```
    
    `GroupWindowTest` is still failing because it misses an identity projection. I'm wondering why `ProjectRemoveRule.INSTANCE` did not kick in...
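
    As a side note on why the `canonize` calls matter, here is a minimal, hypothetical sketch against the plain Calcite type factory (not `FlinkTypeFactory`; the object name is illustrative): the factory interns types by their digest, so creating the same type twice should yield the same instance, while flipping nullability must yield a distinct canonical instance.

    ```scala
    import org.apache.calcite.jdbc.JavaTypeFactoryImpl
    import org.apache.calcite.sql.`type`.SqlTypeName

    object CanonizeSketch {
      def main(args: Array[String]): Unit = {
        val typeFactory = new JavaTypeFactoryImpl()
        val intType = typeFactory.createSqlType(SqlTypeName.INTEGER)

        // -1 means "unlimited cardinality" for array types.
        val arrayA = typeFactory.createArrayType(intType, -1)
        val arrayB = typeFactory.createArrayType(intType, -1)
        // Canonized types from the same factory are interned, so equal types
        // should come back as the very same instance.
        println(arrayA eq arrayB) // expected: true

        // A nullable variant must be a different canonical instance, otherwise
        // the planner cannot tell NULL and NOT NULL columns apart.
        val nullableArray = typeFactory.createTypeWithNullability(arrayA, true)
        println(nullableArray eq arrayA)  // expected: false
        println(nullableArray.isNullable) // true
      }
    }
    ```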



[GitHub] flink issue #3338: [FLINK-5414] [table] Bump up Calcite version to 1.11

Posted by fhueske <gi...@git.apache.org>.
Github user fhueske commented on the issue:

    https://github.com/apache/flink/pull/3338
  
    Thanks for the update @wuchong.
    PR looks good to me. Feel free to merge :-)



[GitHub] flink issue #3338: [FLINK-5414] [table] Bump up Calcite version to 1.11

Posted by wuchong <gi...@git.apache.org>.
Github user wuchong commented on the issue:

    https://github.com/apache/flink/pull/3338
  
    Hi @fhueske, yes, Calcite forces a Calc after each aggregate that only renames fields, because we rename every aggregate in the Table API, which is not necessary. I changed the logic of getting projections on aggregates to only rename the duplicate aggregates. That works well; no extra Calc is appended anymore.
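
    A rough, hypothetical sketch of that renaming idea (the function name and the suffix scheme are illustrative, not the actual `ProjectionTranslator` change): keep the original name for the first occurrence of each aggregate and only generate a fresh alias for duplicates, so no renaming-only Calc has to be appended after the aggregate.

    ```scala
    import scala.collection.mutable

    // Give every projected name a unique alias, but only rename real duplicates.
    def dedupNames(names: Seq[String]): Seq[String] = {
      val seen = mutable.HashSet[String]()
      names.map { name =>
        if (seen.add(name)) {
          name // first occurrence: keep the original name, no rename needed
        } else {
          // duplicate: append a numeric suffix until the alias is unique
          var i = 0
          var candidate = s"${name}_$i"
          while (!seen.add(candidate)) {
            i += 1
            candidate = s"${name}_$i"
          }
          candidate
        }
      }
    }

    // dedupNames(Seq("cnt", "sum", "cnt")) == Seq("cnt", "sum", "cnt_0")
    ```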
    
    Hi @haohui, the ArrayRelDataType is still NOT NULL. I reverted [that line](https://github.com/apache/flink/blob/master/flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala#L121), which does not need to be changed in this PR.
    
    Cheers,
    Jark Wu




[GitHub] flink issue #3338: [FLINK-5414] [table] Bump up Calcite version to 1.11

Posted by docete <gi...@git.apache.org>.
Github user docete commented on the issue:

    https://github.com/apache/flink/pull/3338
  
    @wuchong @haohui Could you help to check FLINK-6173?

