Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/07/15 15:35:18 UTC

[GitHub] [flink-ml] taosiyuan163 opened a new pull request, #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

taosiyuan163 opened a new pull request, #132:
URL: https://github.com/apache/flink-ml/pull/132

   ### What is the purpose of the change
   
   Add Chi-squared test as Transformer to ml.feature.
   
   The chi-square statistic is a useful tool for understanding the relationship between two categorical variables; it gives us a way to quantify and assess the strength of association between a given pair of categorical variables.
   
   Pearson's chi-squared test: https://en.wikipedia.org/wiki/Pearson%27s_chi-squared_test
   For more information on chi-squared tests: http://en.wikipedia.org/wiki/Chi-squared_test
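To make the statistic concrete, here is a minimal, framework-free Java sketch (not part of this PR; the class and method names are illustrative only) that computes the chi-square statistic and degrees of freedom from an observed contingency table:

```java
// Minimal sketch of Pearson's chi-squared test of independence.
// Hypothetical helper class, not the operator added by this PR.
public class ChiSquareSketch {

    // Chi-square statistic: sum over cells of (observed - expected)^2 / expected,
    // where expected[i][j] = rowSum[i] * colSum[j] / total.
    static double statistic(long[][] observed) {
        int rows = observed.length, cols = observed[0].length;
        long total = 0;
        long[] rowSum = new long[rows];
        long[] colSum = new long[cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                rowSum[i] += observed[i][j];
                colSum[j] += observed[i][j];
                total += observed[i][j];
            }
        }
        double chi2 = 0.0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                double expected = (double) rowSum[i] * colSum[j] / total;
                double diff = observed[i][j] - expected;
                chi2 += diff * diff / expected;
            }
        }
        return chi2;
    }

    // Degrees of freedom for an r x c contingency table: (r - 1) * (c - 1).
    static int dof(long[][] observed) {
        return (observed.length - 1) * (observed[0].length - 1);
    }

    public static void main(String[] args) {
        long[][] observed = {{20, 30}, {30, 20}};
        System.out.println(statistic(observed)); // 4.0 for this table
        System.out.println(dof(observed));       // 1
    }
}
```

The p-value is then obtained from the chi-squared distribution with `dof` degrees of freedom, e.g. via `org.apache.commons.math3.distribution.ChiSquaredDistribution`, which is the dependency the PR itself uses.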
   
   ### Brief change log
   Add Chi-squared test as Transformer to ml.feature.
   
   ### Verifying this change
   The changes are tested by unit tests in ChiSqTestTransformerTest.
   
   ### Does this pull request potentially affect one of the following parts:
   Dependencies (does it add or upgrade a dependency): (no)
   The public API, i.e., is any changed class annotated with @Public(Evolving): (no)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] taosiyuan163 commented on pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on PR #132:
URL: https://github.com/apache/flink-ml/pull/132#issuecomment-1187241699

   > Hi @taosiyuan163 , thanks for contributing to Flink ML. Could you please verify your code by running `mvn clean package` command in the root folder? It will check the formats and run all the tests. I can see that there are still errors after cloning your repository and running the command above.
   
   Hi @yunfengzhou-hub, thanks for your comments. I formatted the code according to your suggestion and strictly followed Flink's code style.




[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r933040819


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and DOF (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =

Review Comment:
   I think it is hard to avoid `parallelism=1` here, because the cardinality of the label is unpredictable; the data will eventually be aggregated on a single node to distinguish the labels. However, we can try to reduce the amount of data on the `Map` side.





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930902953


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and DOF (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";

Review Comment:
   > Could `bcCategoricalMarginsKey` and `bcLabelMarginsKey` be two local variables rather than class variables?
   
   I have put them in `transform`.
   
   





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930911825


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and DOF (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =

Review Comment:
   As above, I used transform() and BoundedOneInput#endInput() to optimize the implementation.
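The accumulate-then-emit-at-end pattern mentioned here can be sketched outside of Flink's operator API as a plain-Java aggregator (the class and method names are illustrative; an actual operator would extend `AbstractStreamOperator` and implement `OneInputStreamOperator` and `BoundedOneInput`):

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java sketch of the BoundedOneInput pattern: buffer partial sums while
// records arrive, then emit everything once the bounded input is exhausted.
// Hypothetical class, not the actual operator in this PR.
public class MarginAggregatorSketch {
    private final Map<String, Long> margins = new HashMap<>();

    // Analogous to OneInputStreamOperator#processElement: fold each
    // (key, count) record into the running marginal sums.
    public void processElement(String key, long count) {
        margins.merge(key, count, Long::sum);
    }

    // Analogous to BoundedOneInput#endInput: the input has ended,
    // so the aggregated margins can now be emitted downstream.
    public Map<String, Long> endInput() {
        return margins;
    }
}
```

In the real operator the buffered state would additionally be checkpointed via `ListState` so the aggregation survives failures.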





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r934409257


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables

Review Comment:
   Thanks for your comments. This statistical result yields three measures: the chi-square statistic, the p-value, and the DOF. How about updating the Java doc as follows:
   
   Chi-square Test computes the statistics of independence of variables in a contingency table, e.g., the chi-square statistic, p-value, and DOF (number of degrees of freedom) for each input feature. The contingency table is constructed from the observed categorical values.





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930897436


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes

Review Comment:
   > Could you reformat the Java doc as `An AlgOperator that...` such that the java doc could be consistent with existing implememtations?
   > 
   > Also `This Transformer` -> `This AlgoOperator`
   
   I have applied this wording to the Javadoc header of the class.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] yunfengzhou-hub commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r932927887


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,655 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square test of independence of variables in a contingency table. This function computes
+ * the chi-square statistic and p-value and dof(number of degrees of freedom) for every feature in
+ * the contingency table. The contingency table is constructed from the observed of categorical
+ * values.

Review Comment:
   nit: There are some grammar errors in this paragraph. How about refactoring it as follows:
   
   Chi-square Test is an AlgoOperator that computes the statistics of independence of variables in a contingency table. This function computes the chi-square statistic, p-value, and DOF(number of degrees of freedom) for every feature in the contingency table. The contingency table is constructed from the observed categorical values.
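
   As context for the description above, here is a minimal, framework-free Java sketch of what a chi-square test of independence computes from a contingency table: the statistic, the DOF, and (for even DOF only) the p-value. The class name, the example table, and the `pValueEvenDof` helper are illustrative assumptions, not code from this PR — the actual operator builds its contingency tables from the input streams and, judging by its imports, delegates the p-value to commons-math3's `ChiSquaredDistribution`.

   ```java
   public class ChiSqSketch {

       /** Pearson chi-square statistic for an observed contingency table. */
       static double statistic(long[][] observed) {
           int rows = observed.length;
           int cols = observed[0].length;
           double[] rowSums = new double[rows];
           double[] colSums = new double[cols];
           double total = 0.0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   rowSums[i] += observed[i][j];
                   colSums[j] += observed[i][j];
                   total += observed[i][j];
               }
           }
           double chi2 = 0.0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   // Expected count under independence of the row and column variables.
                   double expected = rowSums[i] * colSums[j] / total;
                   double diff = observed[i][j] - expected;
                   chi2 += diff * diff / expected;
               }
           }
           return chi2;
       }

       /** Number of degrees of freedom of the test. */
       static int dof(long[][] observed) {
           return (observed.length - 1) * (observed[0].length - 1);
       }

       /**
        * Survival function P(X > x) of the chi-squared distribution, valid only for an
        * even number of degrees of freedom: exp(-x/2) * sum_{i=0}^{dof/2 - 1} (x/2)^i / i!.
        */
       static double pValueEvenDof(double x, int dof) {
           double half = x / 2.0;
           double term = 1.0;
           double sum = 1.0;
           for (int i = 1; i < dof / 2; i++) {
               term *= half / i;
               sum += term;
           }
           return Math.exp(-half) * sum;
       }

       public static void main(String[] args) {
           long[][] observed = {{10, 20}, {30, 40}, {50, 60}};
           double chi2 = statistic(observed);
           int d = dof(observed);
           System.out.printf("chi2=%.6f dof=%d p=%.6f%n", chi2, d, pValueEvenDof(chi2, d));
       }
   }
   ```

   For odd degrees of freedom the survival function has no such elementary closed form, which is one reason to rely on a library distribution class rather than hand-rolling it.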





[GitHub] [flink-ml] yunfengzhou-hub commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r927242715


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label

Review Comment:
   nit: there are some grammar errors in these documents. For example, "which constructed" should be "which is constructed", and "statistic,p-value" should be "statistic, p-value".



##########
flink-ml-lib/src/test/java/org/apache/flink/ml/stats/ChiSqTestTest.java:
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.stats.chisqtest.ChiSqTest;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTest}. */
+public class ChiSqTestTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTableWithDoubleLabel;
+    private Table inputTableWithIntegerLabel;
+    private Table inputTableWithStringLabel;
+
+    private final List<Row> samplesWithDoubleLabel =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));
+
+    private final List<Row> expectedChiSqTestResultWithDoubleLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.03419350755, 13.61904761905, 6),
+                    Row.of("f2", 0.24220177737, 7.94444444444, 6));
+
+    private final List<Row> samplesWithIntegerLabel =
+            Arrays.asList(
+                    Row.of(33, 5, "a"),
+                    Row.of(44, 6, "b"),
+                    Row.of(55, 7, "b"),
+                    Row.of(11, 5, "b"),
+                    Row.of(11, 5, "a"),
+                    Row.of(33, 6, "c"),
+                    Row.of(22, 7, "c"),
+                    Row.of(66, 5, "d"),
+                    Row.of(77, 5, "d"),
+                    Row.of(88, 5, "f"),
+                    Row.of(77, 5, "h"),
+                    Row.of(44, 9, "h"),
+                    Row.of(11, 9, "j"));
+
+    private final List<Row> expectedChiSqTestResultWithIntegerLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.35745138256, 22.75, 21),
+                    Row.of("f2", 0.39934987096, 43.69444444444, 42));
+
+    private final List<Row> samplesWithStringLabel =
+            Arrays.asList(
+                    Row.of("v1", 11, 21.22),
+                    Row.of("v1", 33, 22.33),
+                    Row.of("v2", 22, 32.44),
+                    Row.of("v3", 11, 54.22),
+                    Row.of("v3", 33, 22.22),
+                    Row.of("v2", 22, 22.22),
+                    Row.of("v4", 55, 22.22),
+                    Row.of("v5", 11, 41.11),
+                    Row.of("v6", 55, 14.41),
+                    Row.of("v7", 13, 25.55),
+                    Row.of("v8", 14, 25.55),
+                    Row.of("v9", 14, 44.44),
+                    Row.of("v9", 14, 31.11));
+
+    private final List<Row> expectedChiSqTestResultWithStringLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.06672255089, 54.16666666667, 40),
+                    Row.of("f2", 0.42335512313, 73.66666666667, 72));
+
+    @Before
+    public void before() {
+        Configuration config = new Configuration();
+        config.set(ExecutionCheckpointingOptions.ENABLE_CHECKPOINTS_AFTER_TASKS_FINISH, true);
+        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(config);
+        env.setParallelism(4);
+        env.enableCheckpointing(100);
+        env.setRestartStrategy(RestartStrategies.noRestart());
+        tEnv = StreamTableEnvironment.create(env);
+        inputTableWithDoubleLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithDoubleLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithIntegerLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithIntegerLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithStringLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithStringLabel))
+                        .as("label", "f1", "f2");
+    }
+
+    private static void verifyPredictionResult(Table output, List<Row> expected) throws Exception {
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) output).getTableEnvironment();
+        DataStream<Row> outputDataStream = tEnv.toDataStream(output);
+
+        List<Row> result = IteratorUtils.toList(outputDataStream.executeAndCollect());
+
+        compareResultCollections(
+                expected,
+                result,
+                (row1, row2) -> {
+                    if (!row1.equals(row2)) {
+                        return 1;
+                    } else {
+                        return 0;
+                    }
+                });
+    }
+
+    @Test
+    public void testParam() {
+        ChiSqTest chiSqTest = new ChiSqTest();

Review Comment:
   nit: we may also check the default value of `labelCol` here.





[GitHub] [flink-ml] yunfengzhou-hub commented on pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on PR #132:
URL: https://github.com/apache/flink-ml/pull/132#issuecomment-1186670613

   Hi @taosiyuan163 , thanks for contributing to Flink ML. Could you please verify your code by running `mvn clean package` command in the root folder? It will check the formats and run all the tests. I can see that there are still errors after cloning your repository and running the command above.




[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r925571012


##########
flink-ml-lib/src/test/java/org/apache/flink/ml/feature/ChiSqTestTransformerTest.java:
##########
@@ -0,0 +1,144 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.feature.chisqtest.ChiSqTestTransformer;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTransformer}. */
+public class ChiSqTestTransformerTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTable;
+
+    private final List<Row> samples =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));

Review Comment:
   Hi @yunfengzhou-hub, I made the following adjustments according to your suggestions:
   
   - Support string and numerical types for the label column
   
   - `ChiSqTest` now extends a new `ChiSqTestParams` interface instead of implementing the `WithParams` interface directly
   
   - `ChiSqTest` and `ChiSqTestTest` have been moved to the new stats package
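   
   The `ChiSqTestParams` refactoring mentioned above follows the common Flink ML convention of gathering a stage's parameters into a dedicated params interface layered on top of `WithParams`. The following is a heavily simplified, framework-free sketch of that pattern — the string-keyed map, the class names, and the `"label"` default are illustrative placeholders, not the actual Flink ML `Param` machinery:
   
   ```java
   import java.util.HashMap;
   import java.util.Map;
   
   // Simplified stand-in for Flink ML's WithParams pattern (illustrative only).
   interface WithParamsSketch<T> {
       Map<String, Object> getParamMap();
   
       @SuppressWarnings("unchecked")
       default <V> V get(String key, V defaultValue) {
           return (V) getParamMap().getOrDefault(key, defaultValue);
       }
   
       @SuppressWarnings("unchecked")
       default T set(String key, Object value) {
           getParamMap().put(key, value);
           return (T) this;
       }
   }
   
   // Grouping a stage's parameters into a dedicated interface keeps the
   // operator class itself focused on the algorithm.
   interface ChiSqTestParamsSketch<T> extends WithParamsSketch<T> {
       default String getLabelCol() {
           return get("labelCol", "label");
       }
   
       default T setLabelCol(String value) {
           return set("labelCol", value);
       }
   }
   
   public class ChiSqTestSketch implements ChiSqTestParamsSketch<ChiSqTestSketch> {
       private final Map<String, Object> paramMap = new HashMap<>();
   
       @Override
       public Map<String, Object> getParamMap() {
           return paramMap;
       }
   }
   ```
   
   With this layout, parameter getters/setters are inherited by the operator, and a test can check the default value of `labelCol` directly on a freshly constructed stage.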
   





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930906015


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";

Review Comment:
   > Could `bcCategoricalMarginsKey` and `bcLabelMarginsKey` be two local variables rather than class variables?
   
   I have put them in `transform`. 
   
   





[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r932805286


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and number of degrees of freedom for every feature in the
+ * contingency table, which is constructed from the observed frequencies of each categorical value.
+ * All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =

Review Comment:
   Thanks for the explanation.
   
   For 1: I agree that `GenerateObservedFrequencies` will reduce most of the input data in most cases, but there are cases where one column contains many zeros, e.g., when there are many distinct labels but the data is sparse.
   However, postponing `FillZeroFunc` should always outperform the current implementation.
   
   For 2: I think we should not output `FilledObservedFreq` in `ChiSqTest`. We should be careful about preserving the behavior of a well-known algorithm --- `ChiSqTest` does not output the contingency table. If we need to output the contingency table, we can add a new algorithm for that.
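
   For context, the per-feature statistic computed downstream of the contingency table boils down to the following plain-Java sketch (class and method names here are illustrative, not from the PR; the actual operator obtains the p-value from commons-math3's `ChiSquaredDistribution`, which is omitted here to keep the sketch dependency-free):

```java
import java.util.Arrays;

/** Illustrative sketch: Pearson's chi-square statistic and degrees of
 *  freedom for a single feature's contingency table. */
public class ChiSqSketch {

    /** observed[i][j] = count of records with feature value i and label j. */
    static double chiSquareStatistic(long[][] observed) {
        int rows = observed.length, cols = observed[0].length;
        long total = Arrays.stream(observed).flatMapToLong(Arrays::stream).sum();
        long[] rowSum = new long[rows];
        long[] colSum = new long[cols];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                rowSum[i] += observed[i][j];
                colSum[j] += observed[i][j];
            }
        }
        double statistic = 0.0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                // Expected count under the independence hypothesis. Zero cells
                // in `observed` still contribute (observed - expected)^2 / expected,
                // which is why the zero-filling step matters.
                double expected = (double) rowSum[i] * colSum[j] / total;
                double diff = observed[i][j] - expected;
                statistic += diff * diff / expected;
            }
        }
        return statistic;
    }

    static int degreesOfFreedom(long[][] observed) {
        return (observed.length - 1) * (observed[0].length - 1);
    }

    public static void main(String[] args) {
        long[][] observed = {{10, 20}, {30, 40}};
        System.out.println(chiSquareStatistic(observed)); // ~0.7937 (= 50/63)
        System.out.println(degreesOfFreedom(observed));   // 1
    }
}
```

   In the operator itself, the p-value would then be `1 - new ChiSquaredDistribution(dof).cumulativeProbability(statistic)`.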
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] zhipeng93 merged pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 merged PR #132:
URL: https://github.com/apache/flink-ml/pull/132




[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930911037


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and number of degrees of freedom for every feature in the
+ * contingency table, which is constructed from the observed frequencies of each categorical value.
+ * All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(

Review Comment:
   I used `transform()` and `BoundedOneInput#endInput()` to optimize the implementation.
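
   For readers unfamiliar with the pattern: a `BoundedOneInput` operator buffers per-element state in `processElement` and emits its results once in `endInput`, when the bounded input is exhausted. Stripped of the Flink operator plumbing, the zero-filling step looks roughly like this plain-Java sketch (names illustrative, not the PR's actual code):

```java
import java.util.*;

/** Illustrative sketch of the buffer-then-flush pattern behind FillZeroFunc:
 *  collect all (column, featureValue, label, count) records, then on end of
 *  input emit a 0-count record for every (featureValue, label) combination
 *  that was never observed for a column. */
public class FillZeroSketch {
    // (column, feature, label) -> observed count
    private final Map<List<String>, Long> observed = new HashMap<>();
    private final Map<String, Set<String>> featuresPerCol = new HashMap<>();
    private final Set<String> labels = new HashSet<>();

    /** Called once per record, like OneInputStreamOperator#processElement. */
    public void processElement(String col, String feature, String label, long count) {
        observed.put(Arrays.asList(col, feature, label), count);
        featuresPerCol.computeIfAbsent(col, k -> new HashSet<>()).add(feature);
        labels.add(label);
    }

    /** Called once when the bounded input ends, like BoundedOneInput#endInput. */
    public List<List<Object>> endInput() {
        List<List<Object>> out = new ArrayList<>();
        for (Map.Entry<String, Set<String>> e : featuresPerCol.entrySet()) {
            for (String feature : e.getValue()) {
                for (String label : labels) {
                    long cnt = observed.getOrDefault(
                            Arrays.asList(e.getKey(), feature, label), 0L);
                    out.add(Arrays.asList(e.getKey(), feature, label, cnt));
                }
            }
        }
        return out;
    }
}
```

   Compared with `mapPartition`, the `transform()` + `endInput()` route avoids an extra shuffle stage while still guaranteeing a single flush at end of input.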





[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r928507111


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and number of degrees of freedom for every feature in the
+ * contingency table, which is constructed from the observed frequencies of each categorical value.
+ * All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();

Review Comment:
   Could we check the size of the `inputs` here?
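
   Something along these lines at the top of `transform` would do (hedged sketch; in the actual code this would likely be a one-liner such as `Preconditions.checkArgument(inputs.length == 1);`, matching the checks used elsewhere in flink-ml):

```java
/** Illustrative sketch of the argument check suggested above. */
public class InputCheckSketch {
    static void checkSingleInput(Object[] inputs) {
        if (inputs == null || inputs.length != 1) {
            throw new IllegalArgumentException(
                    "ChiSqTest expects exactly one input table, got "
                            + (inputs == null ? 0 : inputs.length));
        }
    }
}
```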



##########
flink-ml-lib/src/test/java/org/apache/flink/ml/stats/ChiSqTestTest.java:
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.stats.chisqtest.ChiSqTest;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTest}. */

Review Comment:
   nit: should be `Tests {@link ChiSqTest}.`



##########
flink-ml-lib/src/test/java/org/apache/flink/ml/stats/ChiSqTestTest.java:
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.stats.chisqtest.ChiSqTest;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTest}. */
+public class ChiSqTestTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTableWithDoubleLabel;
+    private Table inputTableWithIntegerLabel;
+    private Table inputTableWithStringLabel;
+
+    private final List<Row> samplesWithDoubleLabel =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));
+
+    private final List<Row> expectedChiSqTestResultWithDoubleLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.03419350755, 13.61904761905, 6),
+                    Row.of("f2", 0.24220177737, 7.94444444444, 6));
+
+    private final List<Row> samplesWithIntegerLabel =
+            Arrays.asList(
+                    Row.of(33, 5, "a"),
+                    Row.of(44, 6, "b"),
+                    Row.of(55, 7, "b"),
+                    Row.of(11, 5, "b"),
+                    Row.of(11, 5, "a"),
+                    Row.of(33, 6, "c"),
+                    Row.of(22, 7, "c"),
+                    Row.of(66, 5, "d"),
+                    Row.of(77, 5, "d"),
+                    Row.of(88, 5, "f"),
+                    Row.of(77, 5, "h"),
+                    Row.of(44, 9, "h"),
+                    Row.of(11, 9, "j"));
+
+    private final List<Row> expectedChiSqTestResultWithIntegerLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.35745138256, 22.75, 21),
+                    Row.of("f2", 0.39934987096, 43.69444444444, 42));
+
+    private final List<Row> samplesWithStringLabel =
+            Arrays.asList(
+                    Row.of("v1", 11, 21.22),
+                    Row.of("v1", 33, 22.33),
+                    Row.of("v2", 22, 32.44),
+                    Row.of("v3", 11, 54.22),
+                    Row.of("v3", 33, 22.22),
+                    Row.of("v2", 22, 22.22),
+                    Row.of("v4", 55, 22.22),
+                    Row.of("v5", 11, 41.11),
+                    Row.of("v6", 55, 14.41),
+                    Row.of("v7", 13, 25.55),
+                    Row.of("v8", 14, 25.55),
+                    Row.of("v9", 14, 44.44),
+                    Row.of("v9", 14, 31.11));
+
+    private final List<Row> expectedChiSqTestResultWithStringLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.06672255089, 54.16666666667, 40),
+                    Row.of("f2", 0.42335512313, 73.66666666667, 72));
+
+    @Before
+    public void before() {
+        Configuration config = new Configuration();
+        config.set(ExecutionCheckpointingOptions.ENABLE_CHECKPOINTS_AFTER_TASKS_FINISH, true);
+        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(config);
+        env.setParallelism(4);
+        env.enableCheckpointing(100);
+        env.setRestartStrategy(RestartStrategies.noRestart());
+        tEnv = StreamTableEnvironment.create(env);
+        inputTableWithDoubleLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithDoubleLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithIntegerLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithIntegerLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithStringLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithStringLabel))
+                        .as("label", "f1", "f2");
+    }
+
+    private static void verifyPredictionResult(Table output, List<Row> expected) throws Exception {
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) output).getTableEnvironment();
+        DataStream<Row> outputDataStream = tEnv.toDataStream(output);
+
+        List<Row> result = IteratorUtils.toList(outputDataStream.executeAndCollect());
+
+        compareResultCollections(
+                expected,
+                result,
+                (row1, row2) -> {
+                    if (!row1.equals(row2)) {
+                        return 1;
+                    } else {
+                        return 0;
+                    }
+                });
+    }
+
+    @Test
+    public void testParam() {
+        ChiSqTest chiSqTest = new ChiSqTest();
+
+        chiSqTest.setInputCols("f1", "f2").setLabelCol("label");
+
+        assertArrayEquals(new String[] {"f1", "f2"}, chiSqTest.getInputCols());
+        assertEquals("label", chiSqTest.getLabelCol());
+    }
+
+    @Test
+    public void testOutputSchema() {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        Table output = chiSqTest.transform(inputTableWithDoubleLabel)[0];
+        assertEquals(
+                Arrays.asList("column", "pValue", "statistic", "degreesOfFreedom"),
+                output.getResolvedSchema().getColumnNames());
+    }
+
+    @Test
+    public void testTransform() throws Exception {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        Table output1 = chiSqTest.transform(inputTableWithDoubleLabel)[0];
+        verifyPredictionResult(output1, expectedChiSqTestResultWithDoubleLabel);
+
+        Table output2 = chiSqTest.transform(inputTableWithIntegerLabel)[0];
+        verifyPredictionResult(output2, expectedChiSqTestResultWithIntegerLabel);
+
+        Table output3 = chiSqTest.transform(inputTableWithStringLabel)[0];
+        verifyPredictionResult(output3, expectedChiSqTestResultWithStringLabel);
+    }
+
+    @Test
+    public void testSaveLoadAndTransform() throws Exception {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        ChiSqTest loadedBucketizer =

Review Comment:
   nit: `loadedBucketizer` to `loadedChiSqTest`



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and number of degrees of freedom for every feature in the
+ * contingency table, which is constructed from the observed frequencies of each categorical value.
+ * All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =

Review Comment:
   Same as above.



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes

Review Comment:
   Could you reformat the Javadoc as `An AlgoOperator that...` so that it is consistent with existing implementations?
   
   Also, `This Transformer` -> `This AlgoOperator`.



##########
flink-ml-lib/src/test/java/org/apache/flink/ml/stats/ChiSqTestTest.java:
##########
@@ -0,0 +1,205 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.stats.chisqtest.ChiSqTest;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTest}. */
+public class ChiSqTestTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTableWithDoubleLabel;
+    private Table inputTableWithIntegerLabel;
+    private Table inputTableWithStringLabel;
+
+    private final List<Row> samplesWithDoubleLabel =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));
+
+    private final List<Row> expectedChiSqTestResultWithDoubleLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.03419350755, 13.61904761905, 6),
+                    Row.of("f2", 0.24220177737, 7.94444444444, 6));
+
+    private final List<Row> samplesWithIntegerLabel =
+            Arrays.asList(
+                    Row.of(33, 5, "a"),
+                    Row.of(44, 6, "b"),
+                    Row.of(55, 7, "b"),
+                    Row.of(11, 5, "b"),
+                    Row.of(11, 5, "a"),
+                    Row.of(33, 6, "c"),
+                    Row.of(22, 7, "c"),
+                    Row.of(66, 5, "d"),
+                    Row.of(77, 5, "d"),
+                    Row.of(88, 5, "f"),
+                    Row.of(77, 5, "h"),
+                    Row.of(44, 9, "h"),
+                    Row.of(11, 9, "j"));
+
+    private final List<Row> expectedChiSqTestResultWithIntegerLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.35745138256, 22.75, 21),
+                    Row.of("f2", 0.39934987096, 43.69444444444, 42));
+
+    private final List<Row> samplesWithStringLabel =
+            Arrays.asList(
+                    Row.of("v1", 11, 21.22),
+                    Row.of("v1", 33, 22.33),
+                    Row.of("v2", 22, 32.44),
+                    Row.of("v3", 11, 54.22),
+                    Row.of("v3", 33, 22.22),
+                    Row.of("v2", 22, 22.22),
+                    Row.of("v4", 55, 22.22),
+                    Row.of("v5", 11, 41.11),
+                    Row.of("v6", 55, 14.41),
+                    Row.of("v7", 13, 25.55),
+                    Row.of("v8", 14, 25.55),
+                    Row.of("v9", 14, 44.44),
+                    Row.of("v9", 14, 31.11));
+
+    private final List<Row> expectedChiSqTestResultWithStringLabel =
+            Arrays.asList(
+                    Row.of("f1", 0.06672255089, 54.16666666667, 40),
+                    Row.of("f2", 0.42335512313, 73.66666666667, 72));
+
+    @Before
+    public void before() {
+        Configuration config = new Configuration();
+        config.set(ExecutionCheckpointingOptions.ENABLE_CHECKPOINTS_AFTER_TASKS_FINISH, true);
+        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment(config);
+        env.setParallelism(4);
+        env.enableCheckpointing(100);
+        env.setRestartStrategy(RestartStrategies.noRestart());
+        tEnv = StreamTableEnvironment.create(env);
+        inputTableWithDoubleLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithDoubleLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithIntegerLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithIntegerLabel))
+                        .as("label", "f1", "f2");
+        inputTableWithStringLabel =
+                tEnv.fromDataStream(env.fromCollection(samplesWithStringLabel))
+                        .as("label", "f1", "f2");
+    }
+
+    private static void verifyPredictionResult(Table output, List<Row> expected) throws Exception {
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) output).getTableEnvironment();
+        DataStream<Row> outputDataStream = tEnv.toDataStream(output);
+
+        List<Row> result = IteratorUtils.toList(outputDataStream.executeAndCollect());
+
+        compareResultCollections(
+                expected,
+                result,
+                (row1, row2) -> {
+                    if (!row1.equals(row2)) {
+                        return 1;
+                    } else {
+                        return 0;
+                    }
+                });
+    }
+
+    @Test
+    public void testParam() {
+        ChiSqTest chiSqTest = new ChiSqTest();
+
+        chiSqTest.setInputCols("f1", "f2").setLabelCol("label");
+
+        assertArrayEquals(new String[] {"f1", "f2"}, chiSqTest.getInputCols());
+        assertEquals("label", chiSqTest.getLabelCol());
+    }
+
+    @Test
+    public void testOutputSchema() {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        Table output = chiSqTest.transform(inputTableWithDoubleLabel)[0];
+        assertEquals(
+                Arrays.asList("column", "pValue", "statistic", "degreesOfFreedom"),
+                output.getResolvedSchema().getColumnNames());
+    }
+
+    @Test
+    public void testTransform() throws Exception {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        Table output1 = chiSqTest.transform(inputTableWithDoubleLabel)[0];
+        verifyPredictionResult(output1, expectedChiSqTestResultWithDoubleLabel);
+
+        Table output2 = chiSqTest.transform(inputTableWithIntegerLabel)[0];
+        verifyPredictionResult(output2, expectedChiSqTestResultWithIntegerLabel);
+
+        Table output3 = chiSqTest.transform(inputTableWithStringLabel)[0];
+        verifyPredictionResult(output3, expectedChiSqTestResultWithStringLabel);
+    }
+
+    @Test
+    public void testSaveLoadAndTransform() throws Exception {
+        ChiSqTest chiSqTest = new ChiSqTest().setInputCols("f1", "f2").setLabelCol("label");
+
+        ChiSqTest loadedBucketizer =
+                TestUtils.saveAndReload(tEnv, chiSqTest, tempFolder.newFolder().getAbsolutePath());
+        Table output1 = loadedBucketizer.transform(inputTableWithDoubleLabel)[0];
+        verifyPredictionResult(output1, expectedChiSqTestResultWithDoubleLabel);
+
+        Table output2 = loadedBucketizer.transform(inputTableWithIntegerLabel)[0];

Review Comment:
   nit: I think one input for `testSaveLoadAndTransform` is enough since different inputs are already tested in `testTransform`.



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";

Review Comment:
   Could `bcCategoricalMarginsKey` and `bcLabelMarginsKey` be two local variables rather than class variables?



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(

Review Comment:
   How about using `keyBy + reduce`? We only use `mapPartition` when necessary.
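   
   In case it helps, the `keyBy + reduce` suggested here boils down to summing the observed counts per (column, categoricalValue) key. A plain-Java sketch of that merge semantics (class and field layout are hypothetical; the Flink wiring is omitted):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.HashMap;
import java.util.Map;

public class MarginReduceSketch {
    // Sums observed frequencies per (column, categoricalValue) key -- the same
    // result a keyBy on (f0, f1) followed by reduce((a, b) -> a + b on counts)
    // would produce, without buffering a whole partition as mapPartition does.
    static Map<Map.Entry<String, Object>, Long> sumMargins(Iterable<Object[]> observedFreq) {
        // each row: {column, categoricalValue, label, count}
        Map<Map.Entry<String, Object>, Long> margins = new HashMap<>();
        for (Object[] row : observedFreq) {
            Map.Entry<String, Object> key = new SimpleEntry<>((String) row[0], row[1]);
            margins.merge(key, (Long) row[3], Long::sum);
        }
        return margins;
    }
}
```

   The sketch only shows the associative merge; in Flink the reduce function would receive two `Tuple4` values with equal (f0, f1) and return one with the counts summed.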



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());

Review Comment:
   Given that we only need to access each input tuple once, how about we use `transform()` and `BoundedOneInput#endInput()` to optimize the implementation, following [1] here?
   
   Note that `mapPartition` caches the data in memory or on disk and is often expensive. We could have a more efficient implementation here.
   
   
   [1] https://github.com/apache/flink-ml/blob/f9f802125d604f0155221804237fd4140e239602/flink-ml-lib/src/main/java/org/apache/flink/ml/feature/stringindexer/StringIndexer.java#L140
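   
   Stripped of the Flink operator plumbing, the pattern from [1] is: update in-memory state per element, and emit results only once the bounded input ends. A minimal sketch of that shape (class and method names hypothetical):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.BiConsumer;

// Mirrors a OneInputStreamOperator that also implements BoundedOneInput:
// processElement updates an in-memory count per key, and endInput flushes
// the accumulated counts downstream exactly once, when the input is exhausted.
public class CountAtEndSketch<K> {
    private final Map<K, Long> counts = new HashMap<>();

    // Analogue of processElement(StreamRecord<K>): each element is seen once.
    public void processElement(K key) {
        counts.merge(key, 1L, Long::sum);
    }

    // Analogue of BoundedOneInput#endInput(): emit the aggregated result.
    public void endInput(BiConsumer<K, Long> out) {
        counts.forEach(out);
    }
}
```

   In the real operator the state would live in the `AbstractStreamOperator` and `out` would be the operator's `Output`; the sketch only illustrates why no partition-wide cache is needed.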



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic,p-value,and dof(number of degrees of freedom) for every feature in the
+ * contingency table, which constructed from the `observed` for each categorical values. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =

Review Comment:
   How about we just compute the `distinct labels` and postpone the `fill` operation to Line#199, e.g., `DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =...`?
   
   Using `parallelism=1` to process all the data is usually inefficient.
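   
   To illustrate the idea (names hypothetical): if only the distinct label set is broadcast, absent (value, label) pairs can be treated as zero at lookup time, so no zero-count rows need to be materialized by a parallelism-1 operator:

```java
import java.util.Collections;
import java.util.Map;

public class LazyFillSketch {
    // Looks up the observed frequency for (categoricalValue, label), defaulting
    // to zero for pairs that never occurred in the input -- the "fill" happens
    // lazily here instead of in an upstream parallelism-1 FillZeroFunc.
    static long observed(
            Map<Object, Map<Object, Long>> countsByValue, Object value, Object label) {
        return countsByValue
                .getOrDefault(value, Collections.emptyMap())
                .getOrDefault(label, 0L);
    }
}
```

   The per-feature statistic computation would then iterate the distinct labels for each categorical value and call this lookup, getting the zeros for free.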



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the observed frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long>
+                                        colAndFeatureAndLabelAndCount : iterable) {
+                                    Long observedFreq = colAndFeatureAndLabelAndCount.f3;
+                                    Tuple2<String, Object> key =
+                                            new Tuple2<>(
+                                                    colAndFeatureAndLabelAndCount.f0,
+                                                    colAndFeatureAndLabelAndCount.f2);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long labelMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, labelMargin));
+                                }
+                            }
+                        });
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table (DataStream) of the factors (categorical values). The returned
+     * DataStream contains the observed frequencies (i.e. number of occurrences) in each category.
+     */
+    private static class GenerateObservedFrequencies
+            extends RichMapPartitionFunction<
+                    Tuple3<String, Object, Object>, Tuple4<String, Object, Object, Long>> {
+
+        @Override
+        public void mapPartition(
+                Iterable<Tuple3<String, Object, Object>> iterable,
+                Collector<Tuple4<String, Object, Object, Long>> out) {
+
+            HashMap<Tuple3<String, Object, Object>, Long> map = new HashMap<>();
+
+            for (Tuple3<String, Object, Object> key : iterable) {
+                if (map.containsKey(key)) {
+                    Long count = map.get(key);
+                    map.put(key, count + 1);
+                } else {
+                    map.put(key, 1L);
+                }
+            }
+
+            for (Tuple3<String, Object, Object> key : map.keySet()) {
+                Long count = map.get(key);
+                out.collect(new Tuple4<>(key.f0, key.f1, key.f2, count));
+            }
+        }
+    }
+
+    /** Fills in the factors whose frequencies are zero in the frequency table. */
+    private static class FillZeroFunc
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple4<String, Object, Object, Long>,
+                            Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> values = new HashMap<>();
+        HashSet<Object> numLabels = new HashSet<>();
+
+        @Override
+        public void endInput() {
+
+            for (Map.Entry<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> entry :
+                    values.entrySet()) {
+                ArrayList<Tuple2<Object, Long>> labelAndCountList = entry.getValue();
+                Tuple2<String, Object> categoricalKey = entry.getKey();
+
+                List<Object> existingLabels =
+                        labelAndCountList.stream().map(v -> v.f0).collect(Collectors.toList());
+
+                for (Object label : numLabels) {
+                    if (!existingLabels.contains(label)) {
+                        Tuple2<Object, Long> generatedLabelCount = new Tuple2<>(label, 0L);
+                        labelAndCountList.add(generatedLabelCount);
+                    }
+                }
+
+                for (Tuple2<Object, Long> labelAndCount : labelAndCountList) {
+                    output.collect(
+                            new StreamRecord<>(
+                                    new Tuple4<>(
+                                            categoricalKey.f0,
+                                            categoricalKey.f1,
+                                            labelAndCount.f0,
+                                            labelAndCount.f1)));
+                }
+            }
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple4<String, Object, Object, Long>> element) {
+            Tuple4<String, Object, Object, Long> colAndCategoryAndLabelAndCount =
+                    element.getValue();
+            Tuple2<String, Object> key =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f0, colAndCategoryAndLabelAndCount.f1);
+            Tuple2<Object, Long> labelAndCount =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f2, colAndCategoryAndLabelAndCount.f3);
+            ArrayList<Tuple2<Object, Long>> labelAndCountList = values.get(key);
+
+            if (labelAndCountList == null) {
+                ArrayList<Tuple2<Object, Long>> value = new ArrayList<>();
+                value.add(labelAndCount);
+                values.put(key, value);
+            } else {
+                labelAndCountList.add(labelAndCount);
+            }
+
+            numLabels.add(colAndCategoryAndLabelAndCount.f2);
+        }
+    }
+
+    /**
+     * Conducts Pearson's independence test in a contingency table that is constructed from the
+     * input observed frequencies for each categorical value.
+     */
+    private static class ChiSqFunc
+            extends RichMapFunction<
+                    Tuple4<String, Object, Object, Long>, Tuple3<String, Double, Integer>> {
+
+        private final String bcCategoricalMarginsKey;
+        private final String bcLabelMarginsKey;
+        private final Map<Tuple2<String, Object>, Long> categoricalMargins = new HashMap<>();
+        private final Map<Tuple2<String, Object>, Long> labelMargins = new HashMap<>();
+
+        double sampleSize = 0;
+        int numLabels = 0;
+        HashMap<String, Integer> col2NumCategories = new HashMap<>();
+
+        public ChiSqFunc(String bcCategoricalMarginsKey, String bcLabelMarginsKey) {
+            this.bcCategoricalMarginsKey = bcCategoricalMarginsKey;
+            this.bcLabelMarginsKey = bcLabelMarginsKey;
+        }
+
+        @Override
+        public Tuple3<String, Double, Integer> map(Tuple4<String, Object, Object, Long> v) {
+            if (categoricalMargins.isEmpty()) {
+                List<Tuple3<String, Object, Long>> categoricalMarginList =
+                        getRuntimeContext().getBroadcastVariable(bcCategoricalMarginsKey);
+                List<Tuple3<String, Object, Long>> labelMarginList =
+                        getRuntimeContext().getBroadcastVariable(bcLabelMarginsKey);
+
+                for (Tuple3<String, Object, Long> colAndFeatureAndCount : categoricalMarginList) {
+                    String theColName = colAndFeatureAndCount.f0;
+                    col2NumCategories.merge(theColName, 1, Integer::sum);
+                }
+
+                numLabels = (int) labelMarginList.stream().map(x -> x.f1).distinct().count();
+
+                for (Tuple3<String, Object, Long> colAndFeatureAndCount : categoricalMarginList) {
+                    categoricalMargins.put(
+                            new Tuple2<>(colAndFeatureAndCount.f0, colAndFeatureAndCount.f1),
+                            colAndFeatureAndCount.f2);
+                }
+
+                Map<String, Double> sampleSizeCount = new HashMap<>();
+                String tmpKey = null;
+
+                for (Tuple3<String, Object, Long> colAndLabelAndCount : labelMarginList) {
+                    String col = colAndLabelAndCount.f0;
+
+                    if (tmpKey == null) {
+                        tmpKey = col;
+                        sampleSizeCount.put(col, 0D);
+                    }
+
+                    sampleSizeCount.computeIfPresent(
+                            col, (k, count) -> count + colAndLabelAndCount.f2);
+                    labelMargins.put(
+                            new Tuple2<>(col, colAndLabelAndCount.f1), colAndLabelAndCount.f2);
+                }
+
+                Optional<Double> sampleSizeOpt =
+                        sampleSizeCount.values().stream().reduce(Double::sum);
+                Preconditions.checkArgument(sampleSizeOpt.isPresent());
+                sampleSize = sampleSizeOpt.get();
+            }
+
+            String colName = v.f0;
+            // Degrees of freedom
+            int dof = (col2NumCategories.get(colName) - 1) * (numLabels - 1);
+
+            Tuple2<String, Object> category = new Tuple2<>(v.f0, v.f1);
+
+            Tuple2<String, Object> colAndLabelKey = new Tuple2<>(v.f0, v.f2);
+            Long theCategoricalMargin = categoricalMargins.get(category);
+            Long theLabelMargin = labelMargins.get(colAndLabelKey);
+            Long observed = v.f3;
+
+            double expected = (double) (theLabelMargin * theCategoricalMargin) / sampleSize;
+            double categoricalStatistic = pearsonFunc(observed, expected);
+
+            return new Tuple3<>(colName, categoricalStatistic, dof);
+        }
+
+        // Pearson's chi-squared test: http://en.wikipedia.org/wiki/Pearson%27s_chi-squared_test
+        private double pearsonFunc(double observed, double expected) {
+            double dev = observed - expected;
+            return dev * dev / expected;
+        }
+    }
+
+    /**
+     * This function computes the Pearson's chi-squared statistic, p-value, and the number of
+     * degrees of freedom for every feature across the input DataStream.
+     */
+    private static class AggregateChiSqFunc extends AbstractStreamOperator<Row>
+            implements OneInputStreamOperator<Tuple3<String, Double, Integer>, Row>,
+                    BoundedOneInput {
+
+        HashMap<String, Tuple2<Double, Integer>> col2Statistic = new HashMap<>();

Review Comment:
   If you use `col2Statistic` here, we have to store it in `state` for failover.
   
   How about we replace the logic here with existing operators, such as `keyBy + reduce + map`?
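
   Whichever operator layout is chosen, the final aggregation sums the per-cell Pearson terms ((O - E)^2 / E) into one statistic per column. Here is a plain-Java sketch of that arithmetic over a dense contingency table (a hypothetical helper for illustration, not the operator's actual code):

   ```java
   public class ChiSqSketch {
       // Computes the Pearson chi-squared statistic for one feature column from its
       // dense contingency table: rows are feature categories, columns are labels.
       public static double chiSqStatistic(long[][] observed) {
           int rows = observed.length, cols = observed[0].length;
           long[] rowMargin = new long[rows];
           long[] colMargin = new long[cols];
           long total = 0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   rowMargin[i] += observed[i][j];
                   colMargin[j] += observed[i][j];
                   total += observed[i][j];
               }
           }
           double statistic = 0.0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   // Expected frequency under independence, as in ChiSqFunc.
                   double expected = (double) rowMargin[i] * colMargin[j] / total;
                   double dev = observed[i][j] - expected;
                   statistic += dev * dev / expected; // same per-cell term as pearsonFunc
               }
           }
           return statistic;
       }
   }
   ```

   The p-value then follows from the chi-squared CDF with dof = (rows - 1) * (cols - 1), e.g. via commons-math3's `ChiSquaredDistribution`, which the diff already imports.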



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] yunfengzhou-hub commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r927235487


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/feature/chisqtest/ChiSqTestTransformer.java:
##########
@@ -0,0 +1,498 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.Transformer;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.common.param.HasInputCols;
+import org.apache.flink.ml.common.param.HasLabelCol;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the observed frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTestTransformer
+        implements Transformer<ChiSqTestTransformer>,

Review Comment:
   I think it is a good idea to add a `stats` package. @zhipeng93 do you think it is acceptable to add this package in Flink ML?





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930018268


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the observed frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =

Review Comment:
   > How about we just compute the `distinct labels` and postpone the `fill` operation to Line#199, e.g., `DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =...`?
   > 
   > Using `parallelism=1` to compute all the data is usually not efficient.
   
   Hi @zhipeng93, thanks for the comments. If we need to adjust this method block, could we consider the following?
   
   1. Typically, developers use discretized data as input; therefore, `GenerateObservedFrequencies` will reduce large amounts of input data before calling `FillZeroFunc`.
   
   2. `filledObservedFreq` represents the contingency table; maybe it can be output as a result in the future? Refer to pandas `crosstab` in Python.





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930920964


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the observed frequencies of each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long>
+                                        colAndFeatureAndLabelAndCount : iterable) {
+                                    Long observedFreq = colAndFeatureAndLabelAndCount.f3;
+                                    Tuple2<String, Object> key =
+                                            new Tuple2<>(
+                                                    colAndFeatureAndLabelAndCount.f0,
+                                                    colAndFeatureAndLabelAndCount.f2);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long labelMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, labelMargin));
+                                }
+                            }
+                        });
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table (DataStream) of the factors (categorical values). The returned
+     * DataStream contains the observed frequencies (i.e. the number of occurrences) in each
+     * category.
+     */
+    private static class GenerateObservedFrequencies
+            extends RichMapPartitionFunction<
+                    Tuple3<String, Object, Object>, Tuple4<String, Object, Object, Long>> {
+
+        @Override
+        public void mapPartition(
+                Iterable<Tuple3<String, Object, Object>> iterable,
+                Collector<Tuple4<String, Object, Object, Long>> out) {
+
+            HashMap<Tuple3<String, Object, Object>, Long> map = new HashMap<>();
+
+            for (Tuple3<String, Object, Object> key : iterable) {
+                if (map.containsKey(key)) {
+                    Long count = map.get(key);
+                    map.put(key, count + 1);
+                } else {
+                    map.put(key, 1L);
+                }
+            }
+
+            for (Tuple3<String, Object, Object> key : map.keySet()) {
+                Long count = map.get(key);
+                out.collect(new Tuple4<>(key.f0, key.f1, key.f2, count));
+            }
+        }
+    }
+
+    /** Fills the factors whose frequencies are zero in the frequency table. */
+    private static class FillZeroFunc
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple4<String, Object, Object, Long>,
+                            Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> values = new HashMap<>();
+        HashSet<Object> numLabels = new HashSet<>();
+
+        @Override
+        public void endInput() {
+
+            for (Map.Entry<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> entry :
+                    values.entrySet()) {
+                ArrayList<Tuple2<Object, Long>> labelAndCountList = entry.getValue();
+                Tuple2<String, Object> categoricalKey = entry.getKey();
+
+                List<Object> existingLabels =
+                        labelAndCountList.stream().map(v -> v.f0).collect(Collectors.toList());
+
+                for (Object label : numLabels) {
+                    if (!existingLabels.contains(label)) {
+                        Tuple2<Object, Long> generatedLabelCount = new Tuple2<>(label, 0L);
+                        labelAndCountList.add(generatedLabelCount);
+                    }
+                }
+
+                for (Tuple2<Object, Long> labelAndCount : labelAndCountList) {
+                    output.collect(
+                            new StreamRecord<>(
+                                    new Tuple4<>(
+                                            categoricalKey.f0,
+                                            categoricalKey.f1,
+                                            labelAndCount.f0,
+                                            labelAndCount.f1)));
+                }
+            }
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple4<String, Object, Object, Long>> element) {
+            Tuple4<String, Object, Object, Long> colAndCategoryAndLabelAndCount =
+                    element.getValue();
+            Tuple2<String, Object> key =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f0, colAndCategoryAndLabelAndCount.f1);
+            Tuple2<Object, Long> labelAndCount =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f2, colAndCategoryAndLabelAndCount.f3);
+            ArrayList<Tuple2<Object, Long>> labelAndCountList = values.get(key);
+
+            if (labelAndCountList == null) {
+                ArrayList<Tuple2<Object, Long>> value = new ArrayList<>();
+                value.add(labelAndCount);
+                values.put(key, value);
+            } else {
+                labelAndCountList.add(labelAndCount);
+            }
+
+            numLabels.add(colAndCategoryAndLabelAndCount.f2);
+        }
+    }
+
+    /**
+     * Conducts Pearson's independence test on a contingency table that is constructed from the
+     * input observed frequencies of each categorical value.
+     */
+    private static class ChiSqFunc
+            extends RichMapFunction<
+                    Tuple4<String, Object, Object, Long>, Tuple3<String, Double, Integer>> {
+
+        private final String bcCategoricalMarginsKey;
+        private final String bcLabelMarginsKey;
+        private final Map<Tuple2<String, Object>, Long> categoricalMargins = new HashMap<>();
+        private final Map<Tuple2<String, Object>, Long> labelMargins = new HashMap<>();
+
+        double sampleSize = 0;
+        int numLabels = 0;
+        HashMap<String, Integer> col2NumCategories = new HashMap<>();
+
+        public ChiSqFunc(String bcCategoricalMarginsKey, String bcLabelMarginsKey) {
+            this.bcCategoricalMarginsKey = bcCategoricalMarginsKey;
+            this.bcLabelMarginsKey = bcLabelMarginsKey;
+        }
+
+        @Override
+        public Tuple3<String, Double, Integer> map(Tuple4<String, Object, Object, Long> v) {
+            if (categoricalMargins.isEmpty()) {
+                List<Tuple3<String, Object, Long>> categoricalMarginList =
+                        getRuntimeContext().getBroadcastVariable(bcCategoricalMarginsKey);
+                List<Tuple3<String, Object, Long>> labelMarginList =
+                        getRuntimeContext().getBroadcastVariable(bcLabelMarginsKey);
+
+                for (Tuple3<String, Object, Long> colAndFeatureAndCount : categoricalMarginList) {
+                    String theColName = colAndFeatureAndCount.f0;
+                    col2NumCategories.merge(theColName, 1, Integer::sum);
+                }
+
+                numLabels = (int) labelMarginList.stream().map(x -> x.f1).distinct().count();
+
+                for (Tuple3<String, Object, Long> colAndFeatureAndCount : categoricalMarginList) {
+                    categoricalMargins.put(
+                            new Tuple2<>(colAndFeatureAndCount.f0, colAndFeatureAndCount.f1),
+                            colAndFeatureAndCount.f2);
+                }
+
+                Map<String, Double> sampleSizeCount = new HashMap<>();
+                String tmpKey = null;
+
+                for (Tuple3<String, Object, Long> colAndLabelAndCount : labelMarginList) {
+                    String col = colAndLabelAndCount.f0;
+
+                    if (tmpKey == null) {
+                        tmpKey = col;
+                        sampleSizeCount.put(col, 0D);
+                    }
+
+                    sampleSizeCount.computeIfPresent(
+                            col, (k, count) -> count + colAndLabelAndCount.f2);
+                    labelMargins.put(
+                            new Tuple2<>(col, colAndLabelAndCount.f1), colAndLabelAndCount.f2);
+                }
+
+                Optional<Double> sampleSizeOpt =
+                        sampleSizeCount.values().stream().reduce(Double::sum);
+                Preconditions.checkArgument(sampleSizeOpt.isPresent());
+                sampleSize = sampleSizeOpt.get();
+            }
+
+            String colName = v.f0;
+            // Degrees of freedom
+            int dof = (col2NumCategories.get(colName) - 1) * (numLabels - 1);
+
+            Tuple2<String, Object> category = new Tuple2<>(v.f0, v.f1);
+
+            Tuple2<String, Object> colAndLabelKey = new Tuple2<>(v.f0, v.f2);
+            Long theCategoricalMargin = categoricalMargins.get(category);
+            Long theLabelMargin = labelMargins.get(colAndLabelKey);
+            Long observed = v.f3;
+
+            double expected = (double) (theLabelMargin * theCategoricalMargin) / sampleSize;
+            double categoricalStatistic = pearsonFunc(observed, expected);
+
+            return new Tuple3<>(colName, categoricalStatistic, dof);
+        }
+
+        // Pearson's chi-squared test: http://en.wikipedia.org/wiki/Pearson%27s_chi-squared_test
+        private double pearsonFunc(double observed, double expected) {
+            double dev = observed - expected;
+            return dev * dev / expected;
+        }
+    }
+
+    /**
+     * This function computes Pearson's chi-squared statistic, p-value, and the number of degrees
+     * of freedom for every feature across the input DataStream.
+     */
+    private static class AggregateChiSqFunc extends AbstractStreamOperator<Row>
+            implements OneInputStreamOperator<Tuple3<String, Double, Integer>, Row>,
+                    BoundedOneInput {
+
+        HashMap<String, Tuple2<Double, Integer>> col2Statistic = new HashMap<>();

Review Comment:
   I added the state for `col2Statistic` in `AggregateChiSqFunc`.
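
   For context on what `ChiSqFunc` accumulates per cell and `AggregateChiSqFunc` then sums up: the pipeline amounts to Pearson's formula over the contingency table. A standalone plain-Java sketch (class name hypothetical, not part of this PR; the p-value step, which the PR obtains from commons-math3's `ChiSquaredDistribution`, is omitted here):

   ```java
   import java.util.Arrays;

   // Hypothetical standalone sketch; not part of the PR.
   public class PearsonChiSqSketch {

       /**
        * Computes Pearson's chi-squared statistic and the degrees of freedom for an
        * observed contingency table: statistic = sum((O - E)^2 / E), where the expected
        * count E = rowMargin * colMargin / sampleSize, and dof = (rows - 1) * (cols - 1).
        */
       public static double[] chiSq(long[][] observed) {
           int rows = observed.length;
           int cols = observed[0].length;
           long[] rowMargin = new long[rows];
           long[] colMargin = new long[cols];
           long sampleSize = 0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   rowMargin[i] += observed[i][j];
                   colMargin[j] += observed[i][j];
                   sampleSize += observed[i][j];
               }
           }
           double statistic = 0.0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   double expected = (double) rowMargin[i] * colMargin[j] / sampleSize;
                   double dev = observed[i][j] - expected;
                   statistic += dev * dev / expected; // per-cell Pearson term
               }
           }
           int dof = (rows - 1) * (cols - 1);
           return new double[] {statistic, dof};
       }

       public static void main(String[] args) {
           // A table whose rows are proportional is perfectly independent: statistic == 0.
           System.out.println(Arrays.toString(chiSq(new long[][] {{10, 20}, {10, 20}})));
       }
   }
   ```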





[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r932822729


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,655 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square test of independence of variables in a contingency table. This function computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table. The contingency table is constructed from the observed categorical
+ * values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies

Review Comment:
   Could you update the Java doc to `Computes the frequencies of the observed features.`? We usually use the third person in Java docs and capitalize the first letter.
   
   Also, could you please go over the other docs in this PR?





[GitHub] [flink-ml] zhipeng93 commented on pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on PR #132:
URL: https://github.com/apache/flink-ml/pull/132#issuecomment-1204646476

   Thanks for the update @taosiyuan163 . LGTM :)




[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r934122580


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables

Review Comment:
   How about removing `is an AlgoOperator` in this line and updating the Java doc as:
   
   ```
   Chi-square Test computes the statistics of independence of variables in a contingency table, e.g., p-value, and DOF(number of degrees of freedom) for each input feature. The contingency table is constructed from the observed categorical values.
   ```



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables
+ * in a contingency table. This function computes the chi-square statistic, p-value, and DOF(number
+ * of degrees of freedom) for every feature in the contingency table. The contingency table is
+ * constructed from the observed categorical values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                colAndFeatureAndLabel
+                        .keyBy(Tuple3::hashCode)
+                        .transform(
+                                "GenerateObservedFrequencies",
+                                TypeInformation.of(
+                                        new TypeHint<Tuple4<String, Object, Object, Long>>() {}),
+                                new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode())
+                        .transform(
+                                "AggregateCategoricalMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateCategoricalMargins());
+
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode())
+                        .transform(
+                                "AggregateLabelMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateLabelMargins());
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table(DataStream) of the factors(categorical values). The returned

Review Comment:
   How about updating the Javadoc as:
   ```
   Computes the frequency of each feature value at different columns by labels. An output record
   (columnA, featureValueB, labelC, countD) indicates that the feature value {featureValueB} with
   label {labelC} in column {columnA} has appeared {countD} times in the input table.
   ```
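
The counting described in the suggested doc boils down to a keyed tally over (column, featureValue, label) triples. A minimal sketch of that idea (not the operator in the PR; the class name and string-keyed map are illustrative simplifications of the `Tuple3`-keyed `cntMap` in `GenerateObservedFrequencies`):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: counts how often each (column, featureValue,
// label) combination appears, matching the (columnA, featureValueB, labelC,
// countD) record shape described above.
public class ObservedFrequencySketch {

    /** Each input row is {column, featureValue, label}; returns "col|feature|label" -> count. */
    public static Map<String, Long> countFrequencies(String[][] rows) {
        Map<String, Long> counts = new HashMap<>();
        for (String[] row : rows) {
            String key = row[0] + "|" + row[1] + "|" + row[2];
            // Same idea as cntMap.compute(...) in GenerateObservedFrequencies.
            counts.merge(key, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, Long> c =
                countFrequencies(
                        new String[][] {
                            {"f1", "a", "yes"}, {"f1", "a", "yes"}, {"f1", "b", "no"}
                        });
        System.out.println(c);
    }
}
```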



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables
+ * in a contingency table. This function computes the chi-square statistic, p-value, and DOF(number
+ * of degrees of freedom) for every feature in the contingency table. The contingency table is
+ * constructed from the observed categorical values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                colAndFeatureAndLabel
+                        .keyBy(Tuple3::hashCode)
+                        .transform(
+                                "GenerateObservedFrequencies",
+                                TypeInformation.of(
+                                        new TypeHint<Tuple4<String, Object, Object, Long>>() {}),
+                                new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode())
+                        .transform(
+                                "AggregateCategoricalMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateCategoricalMargins());
+
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode())
+                        .transform(
+                                "AggregateLabelMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateLabelMargins());
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table(DataStream) of the factors(categorical values). The returned
+     * DataStream contains the observed frequencies (i.e. number of occurrences) in each category.
+     */
+    private static class GenerateObservedFrequencies
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple3<String, Object, Object>, Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple3<String, Object, Object>, Long> cntMap = new HashMap<>();
+        private ListState<HashMap<Tuple3<String, Object, Object>, Long>> cntMapState;
+
+        @Override
+        public void endInput() {
+            for (Tuple3<String, Object, Object> key : cntMap.keySet()) {
+                Long count = cntMap.get(key);
+                output.collect(new StreamRecord<>(new Tuple4<>(key.f0, key.f1, key.f2, count)));
+            }
+            cntMapState.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple3<String, Object, Object>> element) {
+
+            Tuple3<String, Object, Object> colAndCategoryAndLabel = element.getValue();
+            cntMap.compute(colAndCategoryAndLabel, (k, v) -> (v == null ? 1 : v + 1));
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            cntMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "cntMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple3<String, Object, Object>,
+                                                                    Long>>() {})));
+
+            OperatorStateUtils.getUniqueElement(cntMapState, "cntMapState")
+                    .ifPresent(x -> cntMap = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            cntMapState.update(Collections.singletonList(cntMap));
+        }
+    }
+
+    /** Fills the factors which frequencies are zero in frequency table. */

Review Comment:
   How about changing the function name to `fillFrequencyTable` and updating the Javadoc as follows:
   ```
   Fills the frequency table by setting the frequency of missing elements (i.e., missing combinations of column, featureValue and labelValue) to zero.
   ```
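
The zero-filling the suggested doc describes means enumerating every featureValue x label combination for a column and adding count 0 for the ones never observed, so that the downstream chi-square computation sees a complete contingency table. A minimal sketch of that step (not the PR's `FillZeroFunc`; the class name and "featureValue|label" string keys are illustrative assumptions):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Illustrative sketch only: given observed (featureValue, label) counts for
// one column, emits a count for every featureValue x label combination,
// filling the combinations that never occurred with zero.
public class FillZeroSketch {

    /** Keys are "featureValue|label"; missing combinations are added with count 0. */
    public static Map<String, Long> fillZeros(Map<String, Long> observed) {
        Set<String> features = new HashSet<>();
        Set<String> labels = new HashSet<>();
        for (String key : observed.keySet()) {
            String[] parts = key.split("\\|");
            features.add(parts[0]);
            labels.add(parts[1]);
        }
        Map<String, Long> filled = new HashMap<>(observed);
        for (String f : features) {
            for (String l : labels) {
                filled.putIfAbsent(f + "|" + l, 0L);
            }
        }
        return filled;
    }

    public static void main(String[] args) {
        Map<String, Long> observed = new HashMap<>();
        observed.put("a|yes", 3L);
        observed.put("b|no", 2L);
        System.out.println(fillZeros(observed));
    }
}
```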



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables
+ * in a contingency table. This function computes the chi-square statistic, p-value, and DOF(number
+ * of degrees of freedom) for every feature in the contingency table. The contingency table is
+ * constructed from the observed categorical values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                colAndFeatureAndLabel
+                        .keyBy(Tuple3::hashCode)
+                        .transform(
+                                "GenerateObservedFrequencies",
+                                TypeInformation.of(
+                                        new TypeHint<Tuple4<String, Object, Object, Long>>() {}),
+                                new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode())
+                        .transform(
+                                "AggregateCategoricalMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateCategoricalMargins());
+
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode())
+                        .transform(
+                                "AggregateLabelMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateLabelMargins());
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table(DataStream) of the factors(categorical values). The returned
+     * DataStream contains the observed frequencies (i.e. number of occurrences) in each category.
+     */
+    private static class GenerateObservedFrequencies
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple3<String, Object, Object>, Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple3<String, Object, Object>, Long> cntMap = new HashMap<>();
+        private ListState<HashMap<Tuple3<String, Object, Object>, Long>> cntMapState;
+
+        @Override
+        public void endInput() {
+            for (Tuple3<String, Object, Object> key : cntMap.keySet()) {
+                Long count = cntMap.get(key);
+                output.collect(new StreamRecord<>(new Tuple4<>(key.f0, key.f1, key.f2, count)));
+            }
+            cntMapState.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple3<String, Object, Object>> element) {
+
+            Tuple3<String, Object, Object> colAndCategoryAndLabel = element.getValue();
+            cntMap.compute(colAndCategoryAndLabel, (k, v) -> (v == null ? 1 : v + 1));
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            cntMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "cntMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple3<String, Object, Object>,
+                                                                    Long>>() {})));
+
+            OperatorStateUtils.getUniqueElement(cntMapState, "cntMapState")
+                    .ifPresent(x -> cntMap = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            cntMapState.update(Collections.singletonList(cntMap));
+        }
+    }
+
+    /** Fills in the factors whose frequencies are zero in the frequency table. */
+    private static class FillZeroFunc
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple4<String, Object, Object, Long>,
+                            Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> valuesMap =
+                new HashMap<>();
+        private HashSet<Object> distinctLabels = new HashSet<>();
+
+        private ListState<HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>>>
+                valuesMapState;
+        private ListState<HashSet<Object>> distinctLabelsState;
+
+        @Override
+        public void endInput() {
+
+            for (Map.Entry<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> entry :
+                    valuesMap.entrySet()) {
+                ArrayList<Tuple2<Object, Long>> labelAndCountList = entry.getValue();
+                Tuple2<String, Object> categoricalKey = entry.getKey();
+
+                List<Object> existingLabels =
+                        labelAndCountList.stream().map(v -> v.f0).collect(Collectors.toList());
+
+                for (Object label : distinctLabels) {
+                    if (!existingLabels.contains(label)) {
+                        Tuple2<Object, Long> generatedLabelCount = new Tuple2<>(label, 0L);
+                        labelAndCountList.add(generatedLabelCount);
+                    }
+                }
+
+                for (Tuple2<Object, Long> labelAndCount : labelAndCountList) {
+                    output.collect(
+                            new StreamRecord<>(
+                                    new Tuple4<>(
+                                            categoricalKey.f0,
+                                            categoricalKey.f1,
+                                            labelAndCount.f0,
+                                            labelAndCount.f1)));
+                }
+            }
+
+            valuesMapState.clear();
+            distinctLabelsState.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple4<String, Object, Object, Long>> element) {
+            Tuple4<String, Object, Object, Long> colAndCategoryAndLabelAndCount =
+                    element.getValue();
+            Tuple2<String, Object> key =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f0, colAndCategoryAndLabelAndCount.f1);
+            Tuple2<Object, Long> labelAndCount =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f2, colAndCategoryAndLabelAndCount.f3);
+            ArrayList<Tuple2<Object, Long>> labelAndCountList = valuesMap.get(key);
+
+            if (labelAndCountList == null) {
+                ArrayList<Tuple2<Object, Long>> value = new ArrayList<>();
+                value.add(labelAndCount);
+                valuesMap.put(key, value);
+            } else {
+                labelAndCountList.add(labelAndCount);
+            }
+
+            distinctLabels.add(colAndCategoryAndLabelAndCount.f2);
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            valuesMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "valuesMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple2<String, Object>,
+                                                                    ArrayList<
+                                                                            Tuple2<
+                                                                                    Object,
+                                                                                    Long>>>>() {})));
+            distinctLabelsState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "distinctLabelsState",
+                                            TypeInformation.of(
+                                                    new TypeHint<HashSet<Object>>() {})));
+
+            OperatorStateUtils.getUniqueElement(valuesMapState, "valuesMapState")
+                    .ifPresent(x -> valuesMap = x);
+
+            OperatorStateUtils.getUniqueElement(distinctLabelsState, "distinctLabelsState")
+                    .ifPresent(x -> distinctLabels = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            valuesMapState.update(Collections.singletonList(valuesMap));
+            distinctLabelsState.update(Collections.singletonList(distinctLabels));
+        }
+    }
+
+    /** Returns a DataStream of the marginal sums of the factors. */

Review Comment:
   How about update the java doc as:
   `Computes the marginal sums of different categories.`
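To make the role of these marginal sums concrete: the expected count of a cell is `rowMargin * colMargin / total`, and Pearson's statistic sums `(observed - expected)^2 / expected` over all cells. The following is a minimal plain-Java sketch of that arithmetic, not the PR's Flink operators; the class and method names (`ChiSqSketch`, `chiSquareStatistic`) are hypothetical and chosen here only for illustration.

```java
public class ChiSqSketch {

    // Pearson chi-square statistic for an observed contingency table:
    // expected[i][j] = rowMargin[i] * colMargin[j] / total, then
    // statistic = sum over all cells of (observed - expected)^2 / expected.
    static double chiSquareStatistic(long[][] observed) {
        int rows = observed.length;
        int cols = observed[0].length;
        long[] rowMargin = new long[rows];
        long[] colMargin = new long[cols];
        long total = 0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                rowMargin[i] += observed[i][j];
                colMargin[j] += observed[i][j];
                total += observed[i][j];
            }
        }
        double statistic = 0.0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                double expected = (double) rowMargin[i] * colMargin[j] / total;
                double diff = observed[i][j] - expected;
                statistic += diff * diff / expected;
            }
        }
        return statistic;
    }

    // Degrees of freedom of the test: (rows - 1) * (cols - 1).
    static int degreesOfFreedom(long[][] observed) {
        return (observed.length - 1) * (observed[0].length - 1);
    }

    public static void main(String[] args) {
        long[][] observed = {{10, 20}, {30, 40}};
        // Margins are 30/70 (rows) and 40/60 (columns), total 100;
        // the statistic comes out to roughly 0.794 with 1 degree of freedom.
        System.out.println(chiSquareStatistic(observed));
        System.out.println(degreesOfFreedom(observed));
    }
}
```

The categorical and label margins computed by the two operators above correspond to `rowMargin` and `colMargin` here, built per feature column.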



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,654 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square Test is an AlgoOperator that computes the statistics of independence of variables
+ * in a contingency table. This function computes the chi-square statistic, p-value, and DOF (number
+ * of degrees of freedom) for every feature in the contingency table. The contingency table is
+ * constructed from the observed categorical values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                colAndFeatureAndLabel
+                        .keyBy(Tuple3::hashCode)
+                        .transform(
+                                "GenerateObservedFrequencies",
+                                TypeInformation.of(
+                                        new TypeHint<Tuple4<String, Object, Object, Long>>() {}),
+                                new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode())
+                        .transform(
+                                "AggregateCategoricalMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateCategoricalMargins());
+
+        DataStream<Tuple3<String, Object, Long>> labelMargins =
+                observedFreq
+                        .keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f2).hashCode())
+                        .transform(
+                                "AggregateLabelMargins",
+                                TypeInformation.of(new TypeHint<Tuple3<String, Object, Long>>() {}),
+                                new AggregateLabelMargins());
+
+        Function<List<DataStream<?>>, DataStream<Tuple3<String, Double, Integer>>> function =
+                dataStreams -> {
+                    DataStream stream = dataStreams.get(0);
+                    return stream.map(new ChiSqFunc(bcCategoricalMarginsKey, bcLabelMarginsKey));
+                };
+
+        HashMap<String, DataStream<?>> bcMap =
+                new HashMap<String, DataStream<?>>() {
+                    {
+                        put(bcCategoricalMarginsKey, categoricalMargins);
+                        put(bcLabelMarginsKey, labelMargins);
+                    }
+                };
+
+        DataStream<Tuple3<String, Double, Integer>> categoricalStatistics =
+                BroadcastUtils.withBroadcastStream(
+                        Collections.singletonList(filledObservedFreq), bcMap, function);
+
+        SingleOutputStreamOperator<Row> chiSqTestResult =
+                categoricalStatistics
+                        .transform(
+                                "chiSqTestResult",
+                                new RowTypeInfo(
+                                        new TypeInformation[] {
+                                            Types.STRING, Types.DOUBLE, Types.DOUBLE, Types.INT
+                                        },
+                                        new String[] {
+                                            "column", "pValue", "statistic", "degreesOfFreedom"
+                                        }),
+                                new AggregateChiSqFunc())
+                        .setParallelism(1);
+
+        return new Table[] {tEnv.fromDataStream(chiSqTestResult)};
+    }
+
+    @Override
+    public void save(String path) throws IOException {
+        ReadWriteUtils.saveMetadata(this, path);
+    }
+
+    public static ChiSqTest load(StreamTableEnvironment tEnv, String path) throws IOException {
+        return ReadWriteUtils.loadStageParam(path);
+    }
+
+    @Override
+    public Map<Param<?>, Object> getParamMap() {
+        return paramMap;
+    }
+
+    private static class ExtractColAndFeatureAndLabel
+            extends RichFlatMapFunction<Row, Tuple3<String, Object, Object>> {
+        private final String[] inputCols;
+        private final String labelCol;
+
+        public ExtractColAndFeatureAndLabel(String[] inputCols, String labelCol) {
+            this.inputCols = inputCols;
+            this.labelCol = labelCol;
+        }
+
+        @Override
+        public void flatMap(Row row, Collector<Tuple3<String, Object, Object>> collector) {
+
+            Object label = row.getFieldAs(labelCol);
+
+            for (String colName : inputCols) {
+                Object value = row.getField(colName);
+                collector.collect(new Tuple3<>(colName, value, label));
+            }
+        }
+    }
+
+    /**
+     * Computes a frequency table (DataStream) of the factors (categorical values). The returned
+     * DataStream contains the observed frequencies (i.e. number of occurrences) in each category.
+     */
+    private static class GenerateObservedFrequencies
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple3<String, Object, Object>, Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple3<String, Object, Object>, Long> cntMap = new HashMap<>();
+        private ListState<HashMap<Tuple3<String, Object, Object>, Long>> cntMapState;
+
+        @Override
+        public void endInput() {
+            for (Tuple3<String, Object, Object> key : cntMap.keySet()) {
+                Long count = cntMap.get(key);
+                output.collect(new StreamRecord<>(new Tuple4<>(key.f0, key.f1, key.f2, count)));
+            }
+            cntMapState.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple3<String, Object, Object>> element) {
+
+            Tuple3<String, Object, Object> colAndCategoryAndLabel = element.getValue();
+            cntMap.compute(colAndCategoryAndLabel, (k, v) -> (v == null ? 1 : v + 1));
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            cntMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "cntMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple3<String, Object, Object>,
+                                                                    Long>>() {})));
+
+            OperatorStateUtils.getUniqueElement(cntMapState, "cntMapState")
+                    .ifPresent(x -> cntMap = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            cntMapState.update(Collections.singletonList(cntMap));
+        }
+    }
+
+    /** Fills in the factors whose frequencies are zero in the frequency table. */
+    private static class FillZeroFunc
+            extends AbstractStreamOperator<Tuple4<String, Object, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple4<String, Object, Object, Long>,
+                            Tuple4<String, Object, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> valuesMap =
+                new HashMap<>();
+        private HashSet<Object> distinctLabels = new HashSet<>();
+
+        private ListState<HashMap<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>>>
+                valuesMapState;
+        private ListState<HashSet<Object>> distinctLabelsState;
+
+        @Override
+        public void endInput() {
+
+            for (Map.Entry<Tuple2<String, Object>, ArrayList<Tuple2<Object, Long>>> entry :
+                    valuesMap.entrySet()) {
+                ArrayList<Tuple2<Object, Long>> labelAndCountList = entry.getValue();
+                Tuple2<String, Object> categoricalKey = entry.getKey();
+
+                List<Object> existingLabels =
+                        labelAndCountList.stream().map(v -> v.f0).collect(Collectors.toList());
+
+                for (Object label : distinctLabels) {
+                    if (!existingLabels.contains(label)) {
+                        Tuple2<Object, Long> generatedLabelCount = new Tuple2<>(label, 0L);
+                        labelAndCountList.add(generatedLabelCount);
+                    }
+                }
+
+                for (Tuple2<Object, Long> labelAndCount : labelAndCountList) {
+                    output.collect(
+                            new StreamRecord<>(
+                                    new Tuple4<>(
+                                            categoricalKey.f0,
+                                            categoricalKey.f1,
+                                            labelAndCount.f0,
+                                            labelAndCount.f1)));
+                }
+            }
+
+            valuesMapState.clear();
+            distinctLabelsState.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple4<String, Object, Object, Long>> element) {
+            Tuple4<String, Object, Object, Long> colAndCategoryAndLabelAndCount =
+                    element.getValue();
+            Tuple2<String, Object> key =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f0, colAndCategoryAndLabelAndCount.f1);
+            Tuple2<Object, Long> labelAndCount =
+                    new Tuple2<>(
+                            colAndCategoryAndLabelAndCount.f2, colAndCategoryAndLabelAndCount.f3);
+            ArrayList<Tuple2<Object, Long>> labelAndCountList = valuesMap.get(key);
+
+            if (labelAndCountList == null) {
+                ArrayList<Tuple2<Object, Long>> value = new ArrayList<>();
+                value.add(labelAndCount);
+                valuesMap.put(key, value);
+            } else {
+                labelAndCountList.add(labelAndCount);
+            }
+
+            distinctLabels.add(colAndCategoryAndLabelAndCount.f2);
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            valuesMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "valuesMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple2<String, Object>,
+                                                                    ArrayList<
+                                                                            Tuple2<
+                                                                                    Object,
+                                                                                    Long>>>>() {})));
+            distinctLabelsState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "distinctLabelsState",
+                                            TypeInformation.of(
+                                                    new TypeHint<HashSet<Object>>() {})));
+
+            OperatorStateUtils.getUniqueElement(valuesMapState, "valuesMapState")
+                    .ifPresent(x -> valuesMap = x);
+
+            OperatorStateUtils.getUniqueElement(distinctLabelsState, "distinctLabelsState")
+                    .ifPresent(x -> distinctLabels = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            valuesMapState.update(Collections.singletonList(valuesMap));
+            distinctLabelsState.update(Collections.singletonList(distinctLabels));
+        }
+    }
+
+    /** Returns a DataStream of the marginal sums of the factors. */
+    private static class AggregateCategoricalMargins
+            extends AbstractStreamOperator<Tuple3<String, Object, Long>>
+            implements OneInputStreamOperator<
+                            Tuple4<String, Object, Object, Long>, Tuple3<String, Object, Long>>,
+                    BoundedOneInput {
+
+        private HashMap<Tuple2<String, Object>, Long> categoricalMarginsMap = new HashMap<>();
+
+        private ListState<HashMap<Tuple2<String, Object>, Long>> categoricalMarginsMapState;
+
+        @Override
+        public void endInput() {
+            for (Tuple2<String, Object> key : categoricalMarginsMap.keySet()) {
+                Long categoricalMargin = categoricalMarginsMap.get(key);
+                output.collect(new StreamRecord<>(new Tuple3<>(key.f0, key.f1, categoricalMargin)));
+            }
+            categoricalMarginsMap.clear();
+        }
+
+        @Override
+        public void processElement(StreamRecord<Tuple4<String, Object, Object, Long>> element) {
+
+            Tuple4<String, Object, Object, Long> colAndCategoryAndLabelAndCnt = element.getValue();
+            Tuple2<String, Object> key =
+                    new Tuple2<>(colAndCategoryAndLabelAndCnt.f0, colAndCategoryAndLabelAndCnt.f1);
+            Long observedFreq = colAndCategoryAndLabelAndCnt.f3;
+            categoricalMarginsMap.compute(
+                    key, (k, v) -> (v == null ? observedFreq : v + observedFreq));
+        }
+
+        @Override
+        public void initializeState(StateInitializationContext context) throws Exception {
+            super.initializeState(context);
+            categoricalMarginsMapState =
+                    context.getOperatorStateStore()
+                            .getListState(
+                                    new ListStateDescriptor<>(
+                                            "categoricalMarginsMapState",
+                                            TypeInformation.of(
+                                                    new TypeHint<
+                                                            HashMap<
+                                                                    Tuple2<String, Object>,
+                                                                    Long>>() {})));
+
+            OperatorStateUtils.getUniqueElement(
+                            categoricalMarginsMapState, "categoricalMarginsMapState")
+                    .ifPresent(x -> categoricalMarginsMap = x);
+        }
+
+        @Override
+        public void snapshotState(StateSnapshotContext context) throws Exception {
+            super.snapshotState(context);
+            categoricalMarginsMapState.update(Collections.singletonList(categoricalMarginsMap));
+        }
+    }
+
+    /** Returns a DataStream of the marginal sums of the labels. */

Review Comment:
   How about update the java doc as:
   `Computes the marginal sums of different labels.`
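Once the statistic and degrees of freedom are known, the p-value is the upper tail of a chi-squared distribution; the PR delegates that to commons-math3's `ChiSquaredDistribution`. As a dependency-free illustration of what that call computes, the sketch below uses the closed form that holds only for an even number of degrees of freedom k = 2m, namely exp(-x/2) * sum_{i=0}^{m-1} (x/2)^i / i!. The class name `ChiSqPValue` is hypothetical.

```java
public class ChiSqPValue {

    // Upper-tail p-value P(X >= x) for a chi-squared distribution with an
    // even, positive number of degrees of freedom k = 2m, via the closed form
    // exp(-x/2) * sum_{i=0}^{m-1} (x/2)^i / i!. General (odd) dof needs the
    // regularized incomplete gamma function, as in commons-math3.
    static double pValueEvenDof(double x, int dof) {
        if (dof <= 0 || dof % 2 != 0) {
            throw new IllegalArgumentException("closed form requires even positive dof");
        }
        double half = x / 2.0;
        double term = 1.0; // (x/2)^0 / 0!
        double sum = term;
        for (int i = 1; i < dof / 2; i++) {
            term *= half / i; // build (x/2)^i / i! incrementally
            sum += term;
        }
        return Math.exp(-half) * sum;
    }

    public static void main(String[] args) {
        // For dof = 2 the p-value reduces to exp(-x/2), ~0.135 at x = 4.
        System.out.println(pValueEvenDof(4.0, 2));
    }
}
```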



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] yunfengzhou-hub commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r924035970


##########
flink-ml-lib/src/test/java/org/apache/flink/ml/feature/ChiSqTestTransformerTest.java:
##########
@@ -0,0 +1,144 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.feature.chisqtest.ChiSqTestTransformer;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTransformer}. */
+public class ChiSqTestTransformerTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTable;
+
+    private final List<Row> samples =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));

Review Comment:
   We would expect that the functionality of Flink ML algorithms could be aligned with existing practices. Have you referenced the function of any other machine learning libraries? 
   
   If not, could you please reference the ChisqTest in [Alink](https://github.com/alibaba/Alink)? For example, Alink supports labels to be of integer type, while the current implementation in this PR only supports doubles.
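On the point about label types: the `GenerateObservedFrequencies` operator counts `(column, categoricalValue, label)` triples with `Object`-typed fields, so the counting itself is type-agnostic. The plain-Java sketch below mirrors that `cntMap.compute(...)` pattern outside Flink, using string keys for brevity (the operator uses `Tuple3` keys); the class and method names (`FrequencyCountSketch`, `countTriples`) are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class FrequencyCountSketch {

    // Counts occurrences of each (column, categoricalValue, label) triple,
    // mirroring the cntMap.compute(...) pattern in GenerateObservedFrequencies.
    // Labels are plain Objects, so Integer and Double labels both work.
    static Map<String, Long> countTriples(Object[][] rows, String[] cols, int labelIdx) {
        Map<String, Long> counts = new HashMap<>();
        for (Object[] row : rows) {
            Object label = row[labelIdx];
            for (int c = 0; c < cols.length; c++) {
                if (c == labelIdx) {
                    continue; // the label column is not a feature
                }
                String key = cols[c] + "=" + row[c] + "|label=" + label;
                counts.compute(key, (k, v) -> v == null ? 1L : v + 1L);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Object[][] rows = {
            {0.0, 5, 1.0},
            {0.0, 5, 1.0},
            {2.0, 6, 2.0},
        };
        Map<String, Long> counts = countTriples(rows, new String[] {"f0", "f1", "label"}, 2);
        System.out.println(counts.get("f0=0.0|label=1.0")); // 2
    }
}
```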



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/feature/chisqtest/ChiSqTestTransformer.java:
##########
@@ -0,0 +1,498 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.Transformer;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.common.param.HasInputCols;
+import org.apache.flink.ml.common.param.HasLabelCol;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTestTransformer
+        implements Transformer<ChiSqTestTransformer>,

Review Comment:
   According to its Javadoc, `Transformer` has the semantics that for each input row, there would be one corresponding output transformed row. Thus it might be better for ChiSqTest to be a subclass of `AlgoOperator`, instead of `Transformer`.
   Besides, do you think ChiSqTest can be treated as an Evaluator and placed in package `ml.evaluation`, like `BinaryClassificationEvaluator`?



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/feature/chisqtest/ChiSqTestTransformer.java:
##########
@@ -0,0 +1,498 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.Transformer;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.common.param.HasInputCols;
+import org.apache.flink.ml.common.param.HasLabelCol;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTestTransformer
+        implements Transformer<ChiSqTestTransformer>,
+                HasInputCols<ChiSqTestTransformer>,
+                HasLabelCol<ChiSqTestTransformer> {

Review Comment:
   Let's follow other classes and create a param class for each operator, like `BinaryClassificationEvaluatorParams` for `BinaryClassificationEvaluator`, and let the param class inherit all the required parameter interfaces.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r924394986


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/feature/chisqtest/ChiSqTestTransformer.java:
##########
@@ -0,0 +1,498 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.Transformer;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.common.param.HasInputCols;
+import org.apache.flink.ml.common.param.HasLabelCol;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTestTransformer
+        implements Transformer<ChiSqTestTransformer>,

Review Comment:
   The `Chi-square test` is a statistical function; it does not apply to model evaluation. I think we can place it in a new package called 'stats', at the same level as the 'feature' package, and all statistics-related methods can be put there in the future.
   ```suggestion
           implements AlgoOperator<ChiSqTestTransformer>,
   ```





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930911825


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =

Review Comment:
   As above, I used `transform()` and `BoundedOneInput#endInput()` to optimize the implementation.
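As an aside, the per-key accumulation in the `mapPartition` above (the `containsKey`/`put` pair) can be written more compactly with `Map.merge`. A minimal plain-Java sketch of the marginal-sum logic, using a hypothetical `MarginSketch` class and a flattened `{column, value, frequency}` row layout for illustration:

```java
import java.util.HashMap;
import java.util.Map;

public class MarginSketch {
    // Sums observed frequencies per (column, categorical value) key,
    // mirroring the categoricalMargins computation in the PR.
    public static Map<String, Long> margins(Object[][] colValueFreq) {
        Map<String, Long> map = new HashMap<>();
        for (Object[] t : colValueFreq) {
            // Composite key for (column name, categorical value).
            String key = t[0] + "\u0000" + t[1];
            // One call replaces the containsKey/get/put branch.
            map.merge(key, (Long) t[2], Long::sum);
        }
        return map;
    }
}
```

For input rows {"f0","a",2L}, {"f0","a",3L}, {"f0","b",1L} this produces margins 5 for ("f0","a") and 1 for ("f0","b").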





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930906258


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table, which is constructed from the `observed` frequencies for each categorical
+ * value. All label and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();

Review Comment:
   > Could we check the size of the `inputs` here?
   
   I added a method to check the size of the inputs.
   





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r933059024


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,655 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square test of independence of variables in a contingency table. This function computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table. The contingency table is constructed from the observed frequencies of
+ * categorical values.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+        Preconditions.checkArgument(inputs.length == 1);
+
+        final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+        final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies

Review Comment:
   Thanks for the comments.
   
   Part of the content in the docs was referenced from the Python docs; I'm going to reformat it.





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r933061036


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,655 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square test of independence of variables in a contingency table. This function computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in
+ * the contingency table. The contingency table is constructed from the observed frequencies of categorical
+ * values.

Review Comment:
   Thanks for this document; I will update it.
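
   For reference, the core computation the Javadoc describes — forming expected frequencies from the contingency table's margins and summing `(observed - expected)^2 / expected` — can be sketched in plain Java. This is a self-contained illustration, not the operator's actual implementation, and the class and method names are hypothetical:

```java
// Sketch of Pearson's chi-square statistic for a contingency table
// (rows = feature categories, columns = label categories).
public class ChiSquareSketch {

    /** Returns {statistic, dof} for an observed contingency table. */
    static double[] chiSquare(long[][] observed) {
        int rows = observed.length;
        int cols = observed[0].length;
        long[] rowSum = new long[rows];
        long[] colSum = new long[cols];
        long total = 0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                rowSum[i] += observed[i][j];
                colSum[j] += observed[i][j];
                total += observed[i][j];
            }
        }
        double statistic = 0.0;
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                // Expected frequency under independence: rowMargin * colMargin / total.
                double expected = (double) rowSum[i] * colSum[j] / total;
                double diff = observed[i][j] - expected;
                statistic += diff * diff / expected;
            }
        }
        double dof = (rows - 1) * (cols - 1);
        return new double[] {statistic, dof};
    }

    public static void main(String[] args) {
        // Rows proportional to each other are independent of the label,
        // so the statistic is 0.
        double[] result = chiSquare(new long[][] {{10, 10, 20}, {20, 20, 40}});
        System.out.println("statistic=" + result[0] + ", dof=" + result[1]);
    }
}
```

   The p-value would then follow from the chi-squared CDF — for example via the `ChiSquaredDistribution` from commons-math3 that the quoted code imports; it is omitted here to keep the sketch dependency-free.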





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r924412986


##########
flink-ml-lib/src/test/java/org/apache/flink/ml/feature/ChiSqTestTransformerTest.java:
##########
@@ -0,0 +1,144 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature;
+
+import org.apache.flink.api.common.restartstrategy.RestartStrategies;
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.ml.feature.chisqtest.ChiSqTestTransformer;
+import org.apache.flink.ml.util.TestUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.environment.ExecutionCheckpointingOptions;
+import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.test.util.AbstractTestBase;
+import org.apache.flink.types.Row;
+
+import org.apache.commons.collections.IteratorUtils;
+import org.junit.Before;
+import org.junit.Rule;
+import org.junit.Test;
+import org.junit.rules.TemporaryFolder;
+
+import java.util.Arrays;
+import java.util.List;
+
+import static org.junit.Assert.assertArrayEquals;
+import static org.junit.Assert.assertEquals;
+
+/** Tests {@link ChiSqTestTransformer}. */
+public class ChiSqTestTransformerTest extends AbstractTestBase {
+    @Rule public final TemporaryFolder tempFolder = new TemporaryFolder();
+    private StreamTableEnvironment tEnv;
+    private Table inputTable;
+
+    private final List<Row> samples =
+            Arrays.asList(
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(0., 5, 1.),
+                    Row.of(2., 6, 2.),
+                    Row.of(1., 7, 2.),
+                    Row.of(1., 5, 4.),
+                    Row.of(2., 5, 1.),
+                    Row.of(0., 5, 2.),
+                    Row.of(0., 5, 2.),
+                    Row.of(1., 9, 4.),
+                    Row.of(1., 9, 3.));

Review Comment:
   > We would expect that the functionality of Flink ML algorithms could be aligned with existing practices. Have you referenced the function of any other machine learning libraries?
   > 
   > If not, could you please reference the ChisqTest in [Alink](https://github.com/alibaba/Alink)? For example, Alink supports labels to be of integer type, while the current implementation in this PR only supports doubles.
   
   The `Chi-square test` is implemented with reference to Python's `chi2_contingency` and Spark's `ChiSqTest`.
   
   You can encode the label (string/numerical value) with the `StringIndexer` function before using the `Chi-square test`.
   I can change it to support string and numerical values if necessary.
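
   To make the suggested pre-processing concrete, label encoding simply maps each distinct label value to a numeric index. The sketch below is a hypothetical stand-in (the `fitIndex` helper is invented for illustration), not the actual Flink ML `StringIndexer` API:

```java
// Hypothetical label-encoding sketch: each distinct label is mapped to a
// numeric index in order of first appearance.
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class LabelIndexSketch {

    static Map<String, Integer> fitIndex(List<String> labels) {
        Map<String, Integer> index = new LinkedHashMap<>();
        for (String label : labels) {
            // Assign the next free index the first time a label is seen.
            index.putIfAbsent(label, index.size());
        }
        return index;
    }

    public static void main(String[] args) {
        List<String> labels = Arrays.asList("spam", "ham", "spam", "other");
        System.out.println(fitIndex(labels)); // prints {spam=0, ham=1, other=2}
    }
}
```

   The resulting numeric column can then be fed to the chi-square test as a categorical label.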





[GitHub] [flink-ml] taosiyuan163 commented on pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on PR #132:
URL: https://github.com/apache/flink-ml/pull/132#issuecomment-1196269416

   Hi @zhipeng93, I updated the following according to your comments:
   
   1. Optimized the implementation of several operators.
   2. Reformatted the document.
   3. Fixed the test cases.
   4. Added validation for the size of the `inputs`.
   




[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571]Add Chi-squared test as Transformer to ml.feature

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r929460234


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/feature/chisqtest/ChiSqTestTransformer.java:
##########
@@ -0,0 +1,498 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.feature.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.Transformer;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.common.param.HasInputCols;
+import org.apache.flink.ml.common.param.HasLabelCol;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This Transformer computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in the
+ * contingency table, which is constructed from the `observed` frequencies for each categorical value. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTestTransformer
+        implements Transformer<ChiSqTestTransformer>,

Review Comment:
   Sounds good to me.





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930906258


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in the
+ * contingency table, which is constructed from the `observed` frequencies for each categorical value. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();

Review Comment:
   > Could we check the size of the `inputs` here?
   
   I added a method to check the size of the `inputs`.
   





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930906015


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in the
+ * contingency table, which is constructed from the `observed` frequencies for each categorical value. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";

Review Comment:
   > Could `bcCategoricalMarginsKey` and `bcLabelMarginsKey` be two local variables rather than class variables?
   
   I have put them in `transform`. 
   
   





[GitHub] [flink-ml] taosiyuan163 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
taosiyuan163 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r930911825


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in the
+ * contingency table, which is constructed from the `observed` frequencies for each categorical value. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(
+                        observedFreq.keyBy(tuple -> new Tuple2<>(tuple.f0, tuple.f1).hashCode()),
+                        new MapPartitionFunction<
+                                Tuple4<String, Object, Object, Long>,
+                                Tuple3<String, Object, Long>>() {
+                            @Override
+                            public void mapPartition(
+                                    Iterable<Tuple4<String, Object, Object, Long>> iterable,
+                                    Collector<Tuple3<String, Object, Long>> out) {
+                                HashMap<Tuple2<String, Object>, Long> map = new HashMap<>();
+
+                                for (Tuple4<String, Object, Object, Long> tuple : iterable) {
+                                    Long observedFreq = tuple.f3;
+                                    Tuple2<String, Object> key = new Tuple2<>(tuple.f0, tuple.f1);
+
+                                    if (map.containsKey(key)) {
+                                        Long count = map.get(key);
+                                        map.put(key, count + observedFreq);
+                                    } else {
+                                        map.put(key, observedFreq);
+                                    }
+                                }
+
+                                for (Tuple2<String, Object> key : map.keySet()) {
+                                    Long categoricalMargin = map.get(key);
+                                    out.collect(new Tuple3<>(key.f0, key.f1, categoricalMargin));
+                                }
+                            }
+                        });
+
+        // return a DataStream of the marginal sums of the labels
+        DataStream<Tuple3<String, Object, Long>> labelMargins =

Review Comment:
   As above, I used `transform()` and `BoundedOneInput#endInput()` to optimize the implementation.



##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,492 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.MapPartitionFunction;
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.functions.RichMapPartitionFunction;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.common.datastream.DataStreamUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Chi-square test of independence of variables in a contingency table. This AlgoOperator computes
+ * the chi-square statistic, p-value, and dof (number of degrees of freedom) for every feature in the
+ * contingency table, which is constructed from the `observed` frequencies for each categorical value. All label
+ * and feature values must be categorical.
+ *
+ * <p>See: http://en.wikipedia.org/wiki/Chi-squared_test.
+ */
+public class ChiSqTest implements AlgoOperator<ChiSqTest>, ChiSqTestParams<ChiSqTest> {
+
+    final String bcCategoricalMarginsKey = "bcCategoricalMarginsKey";
+    final String bcLabelMarginsKey = "bcLabelMarginsKey";
+
+    private final Map<Param<?>, Object> paramMap = new HashMap<>();
+
+    public ChiSqTest() {
+        ParamUtils.initializeMapWithDefaultValues(paramMap, this);
+    }
+
+    @Override
+    public Table[] transform(Table... inputs) {
+
+        final String[] inputCols = getInputCols();
+        String labelCol = getLabelCol();
+
+        StreamTableEnvironment tEnv =
+                (StreamTableEnvironment) ((TableImpl) inputs[0]).getTableEnvironment();
+
+        SingleOutputStreamOperator<Tuple3<String, Object, Object>> colAndFeatureAndLabel =
+                tEnv.toDataStream(inputs[0])
+                        .flatMap(new ExtractColAndFeatureAndLabel(inputCols, labelCol));
+
+        // compute the observed frequencies
+        DataStream<Tuple4<String, Object, Object, Long>> observedFreq =
+                DataStreamUtils.mapPartition(
+                        colAndFeatureAndLabel.keyBy(Tuple3::hashCode),
+                        new GenerateObservedFrequencies());
+
+        SingleOutputStreamOperator<Tuple4<String, Object, Object, Long>> filledObservedFreq =
+                observedFreq
+                        .transform(
+                                "filledObservedFreq",
+                                Types.TUPLE(
+                                        Types.STRING,
+                                        Types.GENERIC(Object.class),
+                                        Types.GENERIC(Object.class),
+                                        Types.LONG),
+                                new FillZeroFunc())
+                        .setParallelism(1);
+
+        // return a DataStream of the marginal sums of the factors
+        DataStream<Tuple3<String, Object, Long>> categoricalMargins =
+                DataStreamUtils.mapPartition(

Review Comment:
   I used `transform()` and `BoundedOneInput#endInput()` to optimize the implementation.
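   To illustrate what such an `endInput()`-style operator computes, here is a plain-Java sketch, outside Flink's operator API, of the fill-zero idea: buffer all observed frequencies until the input ends, then emit an explicit zero count for every (feature value, label value) pair of a column that was never observed. Note this is only my reading of `FillZeroFunc`'s semantics, not the PR's actual code, and the `Observed` record is a hypothetical stand-in for `Tuple4<String, Object, Object, Long>`.

   ```java
   import java.util.*;

   public class FillZeroSketch {
       // Hypothetical stand-in for Tuple4<String, Object, Object, Long>:
       // (column name, feature value, label value, observed count).
       record Observed(String col, Object feature, Object label, long count) {}

       // Dense the observed frequencies: for each column, emit a count for every
       // (feature, label) combination, using 0 for combinations never seen.
       static List<Observed> fillZero(List<Observed> observed) {
           Map<String, Set<Object>> featuresPerCol = new HashMap<>();
           Map<String, Set<Object>> labelsPerCol = new HashMap<>();
           Map<List<Object>, Long> counts = new HashMap<>();
           for (Observed o : observed) {
               featuresPerCol.computeIfAbsent(o.col(), k -> new HashSet<>()).add(o.feature());
               labelsPerCol.computeIfAbsent(o.col(), k -> new HashSet<>()).add(o.label());
               counts.put(List.of(o.col(), o.feature(), o.label()), o.count());
           }
           List<Observed> result = new ArrayList<>();
           for (String col : featuresPerCol.keySet()) {
               for (Object f : featuresPerCol.get(col)) {
                   for (Object l : labelsPerCol.get(col)) {
                       result.add(new Observed(col, f, l,
                               counts.getOrDefault(List.of(col, f, l), 0L)));
                   }
               }
           }
           return result;
       }

       public static void main(String[] args) {
           List<Observed> in = List.of(
                   new Observed("f0", "a", 1, 3L),
                   new Observed("f0", "b", 0, 2L)); // ("a", 0) and ("b", 1) unobserved
           System.out.println(fillZero(in).size()); // 2 features x 2 labels = 4 rows
       }
   }
   ```

   In the PR itself this buffering happens inside an operator running at parallelism 1, with `endInput()` marking the point where the buffered map can safely be densified and emitted.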



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-ml] zhipeng93 commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
zhipeng93 commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r934161874


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTestParams.java:
##########
@@ -0,0 +1,11 @@
+package org.apache.flink.ml.stats.chisqtest;

Review Comment:
   nit: Please add the license header here.





[GitHub] [flink-ml] yunfengzhou-hub commented on a diff in pull request #132: [FLINK-28571] Add AlgoOperator for Chi-squared test

Posted by GitBox <gi...@apache.org>.
yunfengzhou-hub commented on code in PR #132:
URL: https://github.com/apache/flink-ml/pull/132#discussion_r932927887


##########
flink-ml-lib/src/main/java/org/apache/flink/ml/stats/chisqtest/ChiSqTest.java:
##########
@@ -0,0 +1,655 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.ml.stats.chisqtest;
+
+import org.apache.flink.api.common.functions.RichFlatMapFunction;
+import org.apache.flink.api.common.functions.RichMapFunction;
+import org.apache.flink.api.common.state.ListState;
+import org.apache.flink.api.common.state.ListStateDescriptor;
+import org.apache.flink.api.common.typeinfo.TypeHint;
+import org.apache.flink.api.common.typeinfo.TypeInformation;
+import org.apache.flink.api.common.typeinfo.Types;
+import org.apache.flink.api.java.tuple.Tuple2;
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.api.java.tuple.Tuple4;
+import org.apache.flink.api.java.typeutils.RowTypeInfo;
+import org.apache.flink.iteration.operator.OperatorStateUtils;
+import org.apache.flink.ml.api.AlgoOperator;
+import org.apache.flink.ml.common.broadcast.BroadcastUtils;
+import org.apache.flink.ml.param.Param;
+import org.apache.flink.ml.util.ParamUtils;
+import org.apache.flink.ml.util.ReadWriteUtils;
+import org.apache.flink.runtime.state.StateInitializationContext;
+import org.apache.flink.runtime.state.StateSnapshotContext;
+import org.apache.flink.streaming.api.datastream.DataStream;
+import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;
+import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
+import org.apache.flink.streaming.api.operators.BoundedOneInput;
+import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
+import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;
+import org.apache.flink.table.api.Table;
+import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
+import org.apache.flink.table.api.internal.TableImpl;
+import org.apache.flink.types.Row;
+import org.apache.flink.util.Collector;
+import org.apache.flink.util.Preconditions;
+
+import org.apache.commons.math3.distribution.ChiSquaredDistribution;
+
+import java.io.IOException;
+import java.math.BigDecimal;
+import java.math.RoundingMode;
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.HashSet;
+import java.util.List;
+import java.util.Map;
+import java.util.Optional;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * An AlgoOperator which implements the Chi-square test algorithm.
+ *
+ * <p>Chi-square test of independence of variables in a contingency table. This function computes
+ * the chi-square statistic and p-value and dof(number of degrees of freedom) for every feature in
+ * the contingency table. The contingency table is constructed from the observed of categorical
+ * values.

Review Comment:
   nit: There are some grammar errors in this paragraph. How about refactoring it as follows:
   
   An AlgoOperator that computes the statistics of independence of variables in a contingency table. This function computes the chi-square statistic, p-value, and DOF(number of degrees of freedom) for every feature in the contingency table. The contingency table is constructed from the observed categorical values.
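   For reference, the quantities named in that paragraph can be sketched in a few lines of plain Java (a hand-rolled illustration, not the PR's implementation): Pearson's statistic is the sum over cells of (observed - expected)^2 / expected, where expected = rowSum * colSum / total, and the DOF of an r x c table is (r - 1) * (c - 1). The p-value is then the upper tail of the chi-squared distribution at that statistic, which the PR obtains from commons-math3's `ChiSquaredDistribution`; it is omitted here to keep the sketch dependency-free.

   ```java
   public class ChiSquareSketch {
       /** Pearson's chi-square statistic for an observed contingency table. */
       static double statistic(long[][] observed) {
           int rows = observed.length, cols = observed[0].length;
           long total = 0;
           long[] rowSum = new long[rows];
           long[] colSum = new long[cols];
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   rowSum[i] += observed[i][j];
                   colSum[j] += observed[i][j];
                   total += observed[i][j];
               }
           }
           double stat = 0.0;
           for (int i = 0; i < rows; i++) {
               for (int j = 0; j < cols; j++) {
                   // Expected count under independence of row and column variables.
                   double expected = (double) rowSum[i] * colSum[j] / total;
                   double d = observed[i][j] - expected;
                   stat += d * d / expected;
               }
           }
           return stat;
       }

       /** Degrees of freedom of an r x c contingency table. */
       static int degreesOfFreedom(long[][] observed) {
           return (observed.length - 1) * (observed[0].length - 1);
       }

       public static void main(String[] args) {
           long[][] table = {{10, 20}, {30, 40}};
           // The p-value would be 1 - cdf(statistic) from a chi-squared
           // distribution with this DOF, e.g. commons-math3's
           // ChiSquaredDistribution#cumulativeProbability.
           System.out.printf("statistic=%.4f dof=%d%n",
                   statistic(table), degreesOfFreedom(table));
       }
   }
   ```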


