Posted to issues@flink.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2017/01/05 23:14:58 UTC

[jira] [Commented] (FLINK-5084) Replace Java Table API integration tests by unit tests

    [ https://issues.apache.org/jira/browse/FLINK-5084?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15802831#comment-15802831 ] 

ASF GitHub Bot commented on FLINK-5084:
---------------------------------------

Github user fhueske commented on a diff in the pull request:

    https://github.com/apache/flink/pull/2977#discussion_r94871520
  
    --- Diff: flink-libraries/flink-table/src/test/scala/org/apache/flink/api/scala/batch/table/SortValidationTest.scala ---
    @@ -0,0 +1,52 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one
    + * or more contributor license agreements.  See the NOTICE file
    + * distributed with this work for additional information
    + * regarding copyright ownership.  The ASF licenses this file
    + * to you under the Apache License, Version 2.0 (the
    + * "License"); you may not use this file except in compliance
    + * with the License.  You may obtain a copy of the License at
    + *
    + *     http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.flink.api.scala.batch.table
    +
    +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase
    +import org.apache.flink.api.scala.batch.utils.TableProgramsTestBase.TableConfigMode
    +import org.apache.flink.api.scala.table._
    +import org.apache.flink.api.scala.util.CollectionDataSets
    +import org.apache.flink.api.scala.{ExecutionEnvironment, _}
    +import org.apache.flink.api.table.{Row, TableEnvironment, ValidationException}
    +import org.apache.flink.test.util.MultipleProgramsTestBase.TestExecutionMode
    +import org.junit._
    +
    +class SortValidationTest(
    +  mode: TestExecutionMode,
    +  configMode: TableConfigMode)
    +  extends TableProgramsTestBase(mode, configMode) {
    --- End diff --
    
    `TableProgramsTestBase` should be removed.
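    
    For reference, a hypothetical sketch of what the validation test could look like once `TableProgramsTestBase` (and its execution/config constructor parameters) is dropped. The test method, data, and the specific validation check are illustrative only, not the actual content of the PR:
    
    {code}
    package org.apache.flink.api.scala.batch.table
    
    import org.apache.flink.api.scala._
    import org.apache.flink.api.scala.table._
    import org.apache.flink.api.table.{TableEnvironment, ValidationException}
    import org.junit.Test
    
    // Hypothetical: no TableProgramsTestBase; the test creates its own environments.
    class SortValidationTest {
    
      @Test(expected = classOf[ValidationException])
      def testOrderByUnknownField(): Unit = {
        val env = ExecutionEnvironment.getExecutionEnvironment
        val tEnv = TableEnvironment.getTableEnvironment(env)
    
        val t = env.fromElements((1, "Hello"), (2, "World")).toTable(tEnv, 'a, 'b)
    
        // ordering on a field that does not exist should fail validation
        t.orderBy('c)
      }
    }
    {code}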


> Replace Java Table API integration tests by unit tests
> ------------------------------------------------------
>
>                 Key: FLINK-5084
>                 URL: https://issues.apache.org/jira/browse/FLINK-5084
>             Project: Flink
>          Issue Type: Task
>          Components: Table API & SQL
>            Reporter: Fabian Hueske
>            Priority: Minor
>
> The Java Table API is a wrapper on top of the Scala Table API. 
> Instead of operating directly on Expressions like the Scala API, the Java API accepts String parameters that are parsed into Expressions.
> We could therefore replace the Java Table API ITCases with tests that check that the parsing step produces a valid logical plan.
> This could be done by creating two {{Table}} objects for the same query, once with the Scala Expression API and once with the Java String API, and comparing the logical plans of both {{Table}} objects. Basically something like the following:
> {code}
> val ds1 = CollectionDataSets.getSmall3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)
> val ds2 = CollectionDataSets.get5TupleDataSet(env).toTable(tEnv, 'd, 'e, 'f, 'g, 'h)
> val joinT1 = ds1.join(ds2).where('b === 'e).select('c, 'g)
> val joinT2 = ds1.join(ds2).where("b = e").select("c, g")
> val lPlan1 = joinT1.logicalPlan
> val lPlan2 = joinT2.logicalPlan
> Assert.assertEquals("Logical Plans do not match", lPlan1, lPlan2)
> {code}
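
A complete unit test built on the snippet above might look like the following sketch. It only rearranges the code from the issue description into a JUnit test; the class name is illustrative, and it assumes {{logicalPlan}} is accessible from test code:

{code}
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.table._
import org.apache.flink.api.scala.util.CollectionDataSets
import org.apache.flink.api.table.TableEnvironment
import org.junit.{Assert, Test}

class JoinStringExpressionTest {

  @Test
  def testJoinWithFilter(): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val tEnv = TableEnvironment.getTableEnvironment(env)

    val ds1 = CollectionDataSets.getSmall3TupleDataSet(env).toTable(tEnv, 'a, 'b, 'c)
    val ds2 = CollectionDataSets.get5TupleDataSet(env).toTable(tEnv, 'd, 'e, 'f, 'g, 'h)

    // same query, once with Scala Expressions and once with Java String expressions
    val joinT1 = ds1.join(ds2).where('b === 'e).select('c, 'g)
    val joinT2 = ds1.join(ds2).where("b = e").select("c, g")

    // compare the logical plans instead of executing the programs
    val lPlan1 = joinT1.logicalPlan
    val lPlan2 = joinT2.logicalPlan

    Assert.assertEquals("Logical Plans do not match", lPlan1, lPlan2)
  }
}
{code}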



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)