Posted to commits@kyuubi.apache.org by ul...@apache.org on 2022/04/18 02:57:32 UTC

[incubator-kyuubi] branch master updated: [KYUUBI #2395] [DOC] Add Documentation for Spark AuthZ Extension

This is an automated email from the ASF dual-hosted git repository.

ulyssesyou pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-kyuubi.git


The following commit(s) were added to refs/heads/master by this push:
     new 8f29b4fd8 [KYUUBI #2395] [DOC] Add Documentation for Spark AuthZ Extension
8f29b4fd8 is described below

commit 8f29b4fd8f4de5c7e495bf9da5826883c59678e5
Author: Kent Yao <ya...@apache.org>
AuthorDate: Mon Apr 18 10:57:24 2022 +0800

    [KYUUBI #2395] [DOC] Add Documentation for Spark AuthZ Extension
    
    ### _Why are the changes needed?_
    
    ### _How was this patch tested?_
    - [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible
    
    - [ ] Add screenshots for manual tests if appropriate
    
    - [ ] [Run test](https://kyuubi.apache.org/docs/latest/develop_tools/testing.html#running-tests) locally before make a pull request
    
    Closes #2395 from yaooqinn/doc2.
    
    Closes #2395
    
    109440bf [Kent Yao] [DOC] Add Documentation for Spark AuthZ Extension
    852e7fd5 [Kent Yao] [DOC] Add Documentation for Spark AuthZ Extension
    dfeef884 [Kent Yao] [DOC] Add Documentation for Spark AuthZ Extension
    
    Authored-by: Kent Yao <ya...@apache.org>
    Signed-off-by: ulysses-you <ul...@apache.org>
---
 docs/security/authorization.md                     |  49 --------
 docs/security/{ => authorization}/index.rst        |  14 +--
 docs/security/authorization/spark/build.md         | 104 +++++++++++++++++
 docs/security/{ => authorization/spark}/index.rst  |  16 ++-
 docs/security/authorization/spark/install.md       | 126 +++++++++++++++++++++
 docs/security/authorization/spark/overview.md      |  57 ++++++++++
 docs/security/index.rst                            |   8 +-
 .../spark/authz/ranger/RangerSparkExtension.scala  |   2 +-
 8 files changed, 305 insertions(+), 71 deletions(-)

diff --git a/docs/security/authorization.md b/docs/security/authorization.md
deleted file mode 100644
index 15078088f..000000000
--- a/docs/security/authorization.md
+++ /dev/null
@@ -1,49 +0,0 @@
-<!--
- - Licensed to the Apache Software Foundation (ASF) under one or more
- - contributor license agreements.  See the NOTICE file distributed with
- - this work for additional information regarding copyright ownership.
- - The ASF licenses this file to You under the Apache License, Version 2.0
- - (the "License"); you may not use this file except in compliance with
- - the License.  You may obtain a copy of the License at
- -
- -   http://www.apache.org/licenses/LICENSE-2.0
- -
- - Unless required by applicable law or agreed to in writing, software
- - distributed under the License is distributed on an "AS IS" BASIS,
- - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- - See the License for the specific language governing permissions and
- - limitations under the License.
- -->
-
-# ACL Management Guide
-
-- [Authorization Modes](#1)
-    - [Storage-Based Authorization](#1.1)
-    - [SQL-Standard Based Authorization](#1.2)
-    - [Ranger Security Support](#1.3)
-
-
-<h2 id="1">Authorization Modes</h2>
-
-Three primary modes for Kyuubi authorization are available by [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security):
-
-<h4 id="1.1">Storage-Based Authorization</h4>
-
-Enabling Storage Based Authorization in the `Hive Metastore Server` uses the HDFS permissions to act as the main source for verification and allows for consistent data and metadata authorization policy. This allows control over metadata access by verifying if the user has permission to access corresponding directories on the HDFS. Similar with `HiveServer2`, files and directories will be translated into hive metadata objects, such as dbs, tables, partitions, and be protected from end use [...]
-
-Storage-Based Authorization offers users with Database, Table and Partition-level coarse-gained access control.
-
-<h4 id="1.2">SQL-Standard Based Authorization</h4>
-
-Enabling SQL-Standard Based Authorization gives users more fine-gained control over access comparing with Storage Based Authorization. Besides of the ability of Storage Based Authorization,  SQL-Standard Based Authorization can improve it to Views and Column-level. Unfortunately, Spark SQL does not support grant/revoke statements which controls access, this might be done only through the  HiveServer2. But it's gratifying that [Submarine Spark Security](https://github.com/apache/submarine [...]
-
-With [Kyuubi](https://github.com/apache/incubator-kyuubi), the SQL-Standard Based Authorization is guaranteed for the security configurations, metadata, and storage information is preserved from end users.
-
-Please refer to the [Submarine Spark Security](https://submarine.apache.org/docs/userDocs/submarine-security/spark-security/README) in the online documentation for an overview on how to configure SQL-Standard Based Authorization for Spark SQL.
-
-<h4 id="1.3">Ranger Security Support (Recommended)</h4>
-
-[Apache Ranger](https://ranger.apache.org/)  is a framework to enable, monitor and manage comprehensive data security across the Hadoop platform but end before Spark or Spark SQL. The [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security) enables Kyuubi with control access ability reusing [Ranger Plugin for Hive MetaStore
-](https://cwiki.apache.org/confluence/display/RANGER/Ranger+Plugin+for+Hive+MetaStore). [Apache Ranger](https://ranger.apache.org/) makes the scope of existing SQL-Standard Based Authorization expanded but without supporting Spark SQL. [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security) sticks them together.
-
-Please refer to the [Submarine Spark Security](https://submarine.apache.org/docs/userDocs/submarine-security/spark-security/README) in the online documentation for an overview on how to configure Ranger for Spark SQL.
diff --git a/docs/security/index.rst b/docs/security/authorization/index.rst
similarity index 80%
copy from docs/security/index.rst
copy to docs/security/authorization/index.rst
index 63e6d896c..f7241847a 100644
--- a/docs/security/index.rst
+++ b/docs/security/authorization/index.rst
@@ -13,18 +13,14 @@
    See the License for the specific language governing permissions and
    limitations under the License.
 
-.. image:: ../imgs/kyuubi_logo.png
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
    :align: center
+   :width: 25%
 
-Kyuubi Security Overview
-========================
+Kyuubi Authorization Guide
+==========================
 
 .. toctree::
     :maxdepth: 2
-    :numbered: 3
-
-    Authentication <authentication>
-    kinit
-    hadoop_credentials_manager
-    authorization
 
+    Spark AuthZ Plugin <spark/index>
diff --git a/docs/security/authorization/spark/build.md b/docs/security/authorization/spark/build.md
new file mode 100644
index 000000000..c4d71d1aa
--- /dev/null
+++ b/docs/security/authorization/spark/build.md
@@ -0,0 +1,104 @@
+<!--
+ - Licensed to the Apache Software Foundation (ASF) under one or more
+ - contributor license agreements.  See the NOTICE file distributed with
+ - this work for additional information regarding copyright ownership.
+ - The ASF licenses this file to You under the Apache License, Version 2.0
+ - (the "License"); you may not use this file except in compliance with
+ - the License.  You may obtain a copy of the License at
+ -
+ -   http://www.apache.org/licenses/LICENSE-2.0
+ -
+ - Unless required by applicable law or agreed to in writing, software
+ - distributed under the License is distributed on an "AS IS" BASIS,
+ - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ - See the License for the specific language governing permissions and
+ - limitations under the License.
+ -->
+
+
+
+# Building Kyuubi Spark AuthZ Plugin
+
+<img src="https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg" alt="Kyuubi logo" width="50%" align="right" />
+
+## Build with Apache Maven
+
+Kyuubi Spark AuthZ Plugin is built using [Apache Maven](http://maven.apache.org).
+To build it, `cd` to the root directory of the Kyuubi project and run:
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests
+```
+
+After a while, if everything goes well, you will get the plugin in two parts (see the sketch below):
+
+- The main plugin jar, which is under `./extensions/spark/kyuubi-spark-authz/target/kyuubi-spark-authz_${scala.binary.version}-${project.version}.jar`
+- The minimal set of transitive dependencies needed, which are under `./extensions/spark/kyuubi-spark-authz/target/scala-${scala.binary.version}/jars`
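+
+For instance, after a successful build the layout may look like the following sketch (file names are illustrative and depend on the Scala binary version and the Kyuubi version you built):
+
+```shell
+# The main plugin jar (illustrative name)
+ls extensions/spark/kyuubi-spark-authz/target/kyuubi-spark-authz_2.12-*.jar
+
+# The bundled transitive dependencies (Ranger client jars, etc.)
+ls extensions/spark/kyuubi-spark-authz/target/scala-2.12/jars/
+```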
+
+### Build against Different Apache Spark Versions
+
+The Maven option `spark.version` is used to specify the Spark version to compile with and to generate the corresponding transitive dependencies.
+By default, the plugin is built with the latest `spark.version` defined in the Kyuubi project's main pom file.
+This may be incompatible with other Spark distributions, in which case you may need to build the plugin yourself, targeting the Spark version you use.
+
+For example,
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests -Dspark.version=3.0.2
+```
+
+The available `spark.version`s are shown in the following table.
+
+|   Spark Version   |  Supported  |                                                              Remark                                                              |
+|:-----------------:|:-----------:|:--------------------------------------------------------------------------------------------------------------------------------:|
+|      master       |      √      |                                                                -                                                                 |
+|       3.3.x       |      √      |                                                                -                                                                 |
+|       3.2.x       |      √      |                                                                -                                                                 |
+|       3.1.x       |      √      |                                                                -                                                                 |
+|       3.0.x       |      √      |                                                                -                                                                 |
+|       2.4.x       |      √      |                                                                -                                                                 |
+| 2.3.x and earlier |      ×      | [PR 2367](https://github.com/apache/incubator-kyuubi/pull/2367) is used to track how we work with older releases with scala 2.11 |
+
+Currently, only Spark releases built with Scala 2.12 are supported.
+
+### Build against Different Apache Ranger Versions
+
+The Maven option `ranger.version` is used to specify the Ranger version to compile with and to generate the corresponding transitive dependencies.
+By default, the plugin is built with the latest `ranger.version` defined in the Kyuubi project's main pom file.
+This may be incompatible with other Ranger Admin servers, in which case you may need to build the plugin yourself, targeting the Ranger Admin version you connect to.
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests -Dranger.version=0.7.0
+```
+
+The available `ranger.version`s are shown in the following table.
+
+| Ranger Version |  Supported  | Remark |
+|:--------------:|:-----------:|:------:|
+|     2.2.x      |      √      |   -    |
+|     2.1.x      |      √      |   -    |
+|     2.0.x      |      √      |   -    |
+|     1.2.x      |      √      |   -    |
+|     1.1.x      |      √      |   -    |
+|     1.0.x      |      √      |   -    |
+|     0.7.x      |      √      |   -    |
+|     0.6.x      |      √      |   -    |
+
+Currently, all Ranger releases listed above are supported.
+
+## Test with ScalaTest Maven plugin
+
+If you omit the `-DskipTests` option in the commands above, all unit tests will run as part of the build.
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12
+```
+
+If a bug occurs and you want to debug the plugin yourself, you can configure `-DdebugForkedProcess=true` and, optionally, `-DdebuggerPort=5005`.
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DdebugForkedProcess=true
+```
+
+The tests will suspend at startup and wait for a remote debugger to attach to the configured port.
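+
+For example, a sketch that combines the two options above, after which you can attach your IDE's remote JVM debugger to port 5005:
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DdebugForkedProcess=true -DdebuggerPort=5005
+```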
+
+We would appreciate it if you could share the bug or the fix with the Kyuubi community.
diff --git a/docs/security/index.rst b/docs/security/authorization/spark/index.rst
similarity index 78%
copy from docs/security/index.rst
copy to docs/security/authorization/spark/index.rst
index 63e6d896c..73d0f3475 100644
--- a/docs/security/index.rst
+++ b/docs/security/authorization/spark/index.rst
@@ -13,18 +13,16 @@
    See the License for the specific language governing permissions and
    limitations under the License.
 
-.. image:: ../imgs/kyuubi_logo.png
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
    :align: center
+   :width: 25%
 
-Kyuubi Security Overview
-========================
+Kyuubi Spark AuthZ Plugin
+=========================
 
 .. toctree::
     :maxdepth: 2
-    :numbered: 3
-
-    Authentication <authentication>
-    kinit
-    hadoop_credentials_manager
-    authorization
 
+    Overview <overview>
+    Building <build>
+    Installing <install>
diff --git a/docs/security/authorization/spark/install.md b/docs/security/authorization/spark/install.md
new file mode 100644
index 000000000..5d3dc22a8
--- /dev/null
+++ b/docs/security/authorization/spark/install.md
@@ -0,0 +1,126 @@
+<!--
+ - Licensed to the Apache Software Foundation (ASF) under one or more
+ - contributor license agreements.  See the NOTICE file distributed with
+ - this work for additional information regarding copyright ownership.
+ - The ASF licenses this file to You under the Apache License, Version 2.0
+ - (the "License"); you may not use this file except in compliance with
+ - the License.  You may obtain a copy of the License at
+ -
+ -   http://www.apache.org/licenses/LICENSE-2.0
+ -
+ - Unless required by applicable law or agreed to in writing, software
+ - distributed under the License is distributed on an "AS IS" BASIS,
+ - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ - See the License for the specific language governing permissions and
+ - limitations under the License.
+ -->
+
+<img src="https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg" alt="Kyuubi logo" width="25%" align="center" />
+
+
+# Installing and Configuring Kyuubi Spark AuthZ Plugin
+
+## Pre-install
+
+- [Apache Ranger](https://ranger.apache.org/)
+
+  This plugin works as a Ranger REST client that talks to the Apache Ranger admin server to perform privilege checks.
+  Thus, a Ranger admin server needs to be installed beforehand and be available to use.
+
+- Building (optional)
+
+  If your Ranger admin or Spark distribution is not compatible with the official pre-built [artifact](https://mvnrepository.com/artifact/org.apache.kyuubi/kyuubi-spark-authz) in Maven Central,
+  you need to [build](build.md) the plugin yourself, targeting the Spark/Ranger versions you are using.
+
+## Install
+
+Make the `kyuubi-spark-authz_*.jar` and its transitive dependencies available on the Spark runtime classpath, for example by:
+- copying them to `$SPARK_HOME/jars`, or
+- specifying them via the `spark.jars` configuration (see the sketch below)
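+
+A minimal sketch of both options, assuming the plugin jar and its dependency jars were collected into an illustrative local directory:
+
+```shell
+# Option 1: copy the jars onto the Spark classpath (path is illustrative)
+cp /path/to/kyuubi-spark-authz-jars/*.jar "$SPARK_HOME/jars/"
+
+# Option 2: pass the jars per application via the spark.jars configuration
+AUTHZ_JARS=$(ls /path/to/kyuubi-spark-authz-jars/*.jar | paste -sd, -)
+"$SPARK_HOME/bin/spark-sql" --conf spark.jars="$AUTHZ_JARS"
+```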
+
+## Configure
+
+### Settings for Connecting Ranger Admin
+
+#### ranger-spark-security.xml
+
+Create `ranger-spark-security.xml` in `$SPARK_HOME/conf` and add the following configurations
+to point to the right Ranger admin server.
+
+```xml
+<configuration>
+    <property>
+        <name>ranger.plugin.spark.policy.rest.url</name>
+        <value>ranger admin address like http://ranger-admin.org:6080</value>
+    </property>
+
+    <property>
+        <name>ranger.plugin.spark.service.name</name>
+        <value>a ranger hive service name</value>
+    </property>
+
+    <property>
+        <name>ranger.plugin.spark.policy.cache.dir</name>
+        <value>./a ranger hive service name/policycache</value>
+    </property>
+
+    <property>
+        <name>ranger.plugin.spark.policy.pollIntervalMs</name>
+        <value>5000</value>
+    </property>
+
+    <property>
+        <name>ranger.plugin.spark.policy.source.impl</name>
+        <value>org.apache.ranger.admin.client.RangerAdminRESTClient</value>
+    </property>
+
+</configuration>
+```
+
+#### ranger-spark-audit.xml
+
+Create `ranger-spark-audit.xml` in `$SPARK_HOME/conf` and add the following configurations
+to enable/disable auditing.
+
+```xml
+<configuration>
+
+    <property>
+        <name>xasecure.audit.is.enabled</name>
+        <value>true</value>
+    </property>
+
+    <property>
+        <name>xasecure.audit.destination.db</name>
+        <value>false</value>
+    </property>
+
+    <property>
+        <name>xasecure.audit.destination.db.jdbc.driver</name>
+        <value>com.mysql.jdbc.Driver</value>
+    </property>
+
+    <property>
+        <name>xasecure.audit.destination.db.jdbc.url</name>
+        <value>jdbc:mysql://10.171.161.78/ranger</value>
+    </property>
+
+    <property>
+        <name>xasecure.audit.destination.db.password</name>
+        <value>rangeradmin</value>
+    </property>
+
+    <property>
+        <name>xasecure.audit.destination.db.user</name>
+        <value>rangeradmin</value>
+    </property>
+
+</configuration>
+```
+
+### Settings for Spark Session Extensions
+
+Add `org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension` to the Spark configuration `spark.sql.extensions`.
+
+```properties
+spark.sql.extensions=org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension
+```
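+
+Putting it together, a hedged end-to-end sketch of starting a Spark SQL session with the plugin enabled, assuming the two XML files above are in `$SPARK_HOME/conf` and the plugin jars are already on the classpath:
+
+```shell
+# Illustrative launch command; adjust to spark-submit/spark-shell as needed
+"$SPARK_HOME/bin/spark-sql" \
+  --conf spark.sql.extensions=org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension
+```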
diff --git a/docs/security/authorization/spark/overview.md b/docs/security/authorization/spark/overview.md
new file mode 100644
index 000000000..52cae3880
--- /dev/null
+++ b/docs/security/authorization/spark/overview.md
@@ -0,0 +1,57 @@
+<!--
+ - Licensed to the Apache Software Foundation (ASF) under one or more
+ - contributor license agreements.  See the NOTICE file distributed with
+ - this work for additional information regarding copyright ownership.
+ - The ASF licenses this file to You under the Apache License, Version 2.0
+ - (the "License"); you may not use this file except in compliance with
+ - the License.  You may obtain a copy of the License at
+ -
+ -   http://www.apache.org/licenses/LICENSE-2.0
+ -
+ - Unless required by applicable law or agreed to in writing, software
+ - distributed under the License is distributed on an "AS IS" BASIS,
+ - WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ - See the License for the specific language governing permissions and
+ - limitations under the License.
+ -->
+
+# Kyuubi AuthZ Plugin For Spark SQL
+
+<img src="https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg" alt="Kyuubi logo" width="25%" align="right" />
+
+Security is one of the fundamental features for enterprise adoption of Kyuubi.
+When deploying Kyuubi against secured clusters,
+storage-based authorization is enabled by default, which provides only a file-level, coarse-grained authorization mode.
+When row/column-level fine-grained access control is required,
+we can enhance the data access model with the Kyuubi Spark AuthZ plugin.
+
+## Authorization in Kyuubi
+
+### Storage-based Authorization
+
+As Kyuubi supports multi-tenancy, a tenant can only access the resources it is authorized for,
+including computing resources, data, etc.
+Most file systems, such as HDFS, support ACL management based on files and directories.
+
+A so-called storage-based authorization mode is supported by Kyuubi by default.
+In this model, all objects in the metadata layer, such as databases, tables and partitions, are mapped to folders or files in the storage layer,
+and so are their permissions.
+
+Storage-based authorization offers users database, table and partition-level coarse-grained access control.
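+
+For illustration, a minimal sketch of what this looks like at the storage layer, assuming an HDFS-backed warehouse at an illustrative path:
+
+```shell
+# Grant a hypothetical user read access to one table's directory via HDFS ACLs
+hdfs dfs -setfacl -m user:alice:r-x /user/hive/warehouse/sales.db/orders
+# Inspect the resulting ACLs
+hdfs dfs -getfacl /user/hive/warehouse/sales.db/orders
+```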
+
+### SQL-standard authorization with Ranger
+
+SQL-standard authorization usually offers row/column-level fine-grained access control to meet real-world data security needs.
+
+[Apache Ranger](https://ranger.apache.org/) is a framework to enable, monitor and manage comprehensive data security across the Hadoop platform.
+This plugin provides Kyuubi with data and metadata access control for Spark SQL engines, including:
+
+- Column-level fine-grained authorization
+- Row-level fine-grained authorization, a.k.a. Row-level filtering
+- Data masking
+
+## The Plugin Itself
+
+The Kyuubi Spark AuthZ plugin itself provides general-purpose ACL management for data and metadata when using Spark SQL.
+It does not have to be deployed with the Kyuubi server and engine, and can be used as an extension for any Spark SQL job.
+However, authorization always requires a robust authentication layer and multi-tenancy support, so Kyuubi is a perfect match.
diff --git a/docs/security/index.rst b/docs/security/index.rst
index 63e6d896c..2e8cee822 100644
--- a/docs/security/index.rst
+++ b/docs/security/index.rst
@@ -13,18 +13,20 @@
    See the License for the specific language governing permissions and
    limitations under the License.
 
-.. image:: ../imgs/kyuubi_logo.png
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
    :align: center
+   :width: 25%
 
 Kyuubi Security Overview
 ========================
 
+Securing Kyuubi involves enabling authentication (authn), authorization (authz), encryption, etc.
+
 .. toctree::
     :maxdepth: 2
-    :numbered: 3
 
     Authentication <authentication>
+    Authorization <authorization/index>
     kinit
     hadoop_credentials_manager
-    authorization
 
diff --git a/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala b/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
index daaa6754a..979fc550b 100644
--- a/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
+++ b/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
@@ -26,7 +26,7 @@ import org.apache.kyuubi.plugin.spark.authz.util.RuleEliminateMarker
  * <ul>
  *   <li>Table/Column level authorization(yes)</li>
  *   <li>Row level filtering(yes)</li>
- *   <li>Data masking(no)</li>
+ *   <li>Data masking(yes)</li>
  * <ul>
  *
  * To work with Spark SQL, we need to enable it via spark extensions