Posted to commits@spark.apache.org by gu...@apache.org on 2020/04/13 01:02:33 UTC

[spark] branch master updated: [SPARK-31330] Automatically label PRs based on the paths they touch

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 1b87015  [SPARK-31330] Automatically label PRs based on the paths they touch
1b87015 is described below

commit 1b8701504457ec291e6221dd63eec6e6be999150
Author: Nicholas Chammas <ni...@gmail.com>
AuthorDate: Mon Apr 13 10:01:31 2020 +0900

    [SPARK-31330] Automatically label PRs based on the paths they touch
    
    ### What changes were proposed in this pull request?
    
    This PR adds a set of rules that the Probot Auto Labeler will use to label PRs based on the paths they modify.
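
    As a sketch of how one of these rules is applied (illustrative file paths; the gitignore-style matching shown here uses the `ignore` npm package that the bot's matcher builds on, per the comments in the config below):

        // Sketch only: how the DOCS rule would classify a few changed files.
        import ignore from "ignore";

        const docs = ignore().add(["docs/", "/README.md", "/CONTRIBUTING.md"]);

        docs.ignores("docs/configuration.md"); // true  -> the PR would get the DOCS label
        docs.ignores("README.md");             // true  -> "/README.md" is anchored at the repo root
        docs.ignores("core/README.md");        // false -> the anchored pattern does not match nested files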
    
    ### Why are the changes needed?
    
    This should make it easier for committers to organize PRs, and it could also help drive downstream tooling like the PR dashboard.
    
    ### Does this PR introduce any user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    I believe we will only be able to fully test this after merging it. Given that [the Avro project already uses this same bot](https://github.com/apache/avro/blob/master/.github/autolabeler.yml), I expect it will be straightforward to get working.
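
    In the meantime, one way to sanity-check the patterns locally is to run a PR's changed-file list through the same gitignore-style matcher (a rough sketch, not part of this PR; it assumes Node.js with the `ignore` and `js-yaml` packages, and the file list is illustrative):

        // Dry-run sketch: apply the autolabeler rules to a hypothetical set of changed files.
        import * as fs from "fs";
        import ignore from "ignore";
        import * as yaml from "js-yaml";

        const rules = yaml.load(
          fs.readFileSync(".github/autolabeler.yml", "utf8"),
        ) as Record<string, string[]>;
        const changedFiles = ["python/pyspark/sql/functions.py", "docs/configuration.md"];

        for (const [label, patterns] of Object.entries(rules)) {
          if (changedFiles.some((file) => ignore().add(patterns).ignores(file))) {
            console.log(`would apply label: ${label}`);
          }
        }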
    
    Closes #28114 from nchammas/SPARK-31330-auto-label-prs.
    
    Lead-authored-by: Nicholas Chammas <ni...@gmail.com>
    Co-authored-by: HyukjinKwon <gu...@apache.org>
    Co-authored-by: Nicholas Chammas <ni...@liveramp.com>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 .github/autolabeler.yml | 131 ++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 131 insertions(+)

diff --git a/.github/autolabeler.yml b/.github/autolabeler.yml
new file mode 100644
index 0000000..e842090
--- /dev/null
+++ b/.github/autolabeler.yml
@@ -0,0 +1,131 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    https://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+# Bot page: https://github.com/apps/probot-autolabeler
+# The matching patterns follow the .gitignore spec.
+# See: https://git-scm.com/docs/gitignore#_pattern_format
+# Also, note that the plugin uses 'ignore' package. See also
+# https://github.com/kaelzhang/node-ignore
+INFRA:
+  - ".github/"
+  - "appveyor.yml"
+  - "/tools/"
+  - "/dev/create-release/"
+  - ".asf.yaml"
+  - ".gitattributes"
+  - ".gitignore"
+  - "/dev/github_jira_sync.py"
+  - "/dev/merge_spark_pr.py"
+  - "/dev/run-tests-jenkins*"
+BUILD:
+  - "/dev/"
+  - "!/dev/github_jira_sync.py"
+  - "!/dev/merge_spark_pr.py"
+  - "!/dev/run-tests-jenkins*"
+  - "/build/"
+  - "/project/"
+  - "/assembly/"
+  - "*pom.xml"
+  - "/bin/docker-image-tool.sh"
+  - "/bin/find-spark-home*"
+  - "scalastyle-config.xml"
+DOCS:
+  - "docs/"
+  - "/README.md"
+  - "/CONTRIBUTING.md"
+EXAMPLES:
+  - "examples/"
+  - "/bin/run-example*"
+CORE:
+  - "/core/"
+  - "/common/kvstore/"
+  - "/common/network-common/"
+  - "/common/network-shuffle/"
+  - "/python/pyspark/*.py"
+  - "/python/pyspark/tests/*.py"
+  - "/sbin/*master*.sh"
+  - "/sbin/*slave*.sh"
+  - "/sbin/spark-config.sh"
+  - "/sbin/*daemon*.sh"
+  - "/sbin/*history*.sh"
+  - "/sbin/*mesos*.sh"
+SPARK SUBMIT:
+  - "/bin/spark-submit*"
+SPARK SHELL:
+  - "/repl/"
+SQL:
+  - "sql/"
+  - "/common/unsafe/"
+  - "!/python/pyspark/sql/avro/"
+  - "!/python/pyspark/sql/streaming.py"
+  - "!/python/pyspark/sql/tests/test_streaming.py"
+  - "/bin/spark-sql*"
+  - "/bin/beeline*"
+  - "/sbin/*thriftserver*.sh"
+  - "*SQL*.R"
+  - "DataFrame.R"
+  - "WindowSpec.R"
+  - "catalog.R"
+  - "column.R"
+  - "functions.R"
+  - "group.R"
+  - "schema.R"
+  - "types.R"
+AVRO:
+  - "/external/avro/"
+  - "/python/pyspark/sql/avro/"
+DSTREAM:
+  - "/streaming/"
+  - "/data/streaming/"
+  - "/external/flume*"
+  - "/external/kinesis*"
+  - "/external/kafka*"
+  - "/python/pyspark/streaming/"
+GRAPHX:
+  - "/graphx/"
+  - "/data/graphx/"
+ML:
+  - "ml/"
+  - "*mllib_*.R"
+MLLIB:
+  - "spark/mllib/"
+  - "/mllib-local/"
+  - "/python/pyspark/mllib/"
+STRUCTURED STREAMING:
+  - "sql/**/streaming/"
+  - "/external/kafka-0-10-sql/"
+  - "/python/pyspark/sql/streaming.py"
+  - "/python/pyspark/sql/tests/test_streaming.py"
+  - "*streaming.R"
+PYTHON:
+  - "/bin/pyspark*"
+  - "python/"
+R:
+  - "r/"
+  - "R/"
+  - "/bin/sparkR*"
+YARN:
+  - "/resource-managers/yarn/"
+MESOS:
+  - "/resource-managers/mesos/"
+KUBERNETES:
+  - "/resource-managers/kubernetes/"
+WINDOWS:
+  - "*.cmd"
+  - "/R/pkg/tests/fulltests/test_Windows.R"
+WEB UI:
+  - "ui/"


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org