Posted to dev@submarine.apache.org by li...@apache.org on 2021/11/11 08:59:26 UTC

[submarine] branch master updated: SUBMARINE-1078. Retire Spark security module from submarine project

This is an automated email from the ASF dual-hosted git repository.

liuxun pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/submarine.git


The following commit(s) were added to refs/heads/master by this push:
     new 50ae483  SUBMARINE-1078. Retire Spark security module from submarine project
50ae483 is described below

commit 50ae48339df4a11d115ac4b76b363da9796a3260
Author: Kent Yao <ya...@apache.org>
AuthorDate: Thu Nov 11 16:33:03 2021 +0800

    SUBMARINE-1078. Retire Spark security module from submarine project
    
    ### What is this PR for?
    Retire the Spark security module from the Submarine project.
    
    ### What type of PR is it?
    
    Refactoring
    
    ### Todos
    
    None
    
    ### What is the Jira issue?
    SUBMARINE-1078
    
    ### How should this be tested?
    Passing GitHub Actions (GA) CI.
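    
    A local sanity check without the retired module can mirror the flags of
    the remaining CI jobs, e.g. (a sketch, not the exact CI invocation):
    
        mvn clean install -Dmaven.javadoc.skip=true -ntp -B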
    
    ### Screenshots (if appropriate)
    
    ### Questions:
    * Do the license files need updating? No
    * Are there breaking changes for older versions? Yes
    * Does this need new documentation? No
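    
    Note for downstream users of the retired plugin: once the
    submarine-spark-security artifact is gone, Spark jobs that load it
    through spark.sql.extensions will fail to resolve the extension class.
    A typical (hypothetical) configuration affected, based on the api
    classes removed below:
    
        spark-shell --conf spark.sql.extensions=org.apache.submarine.spark.security.api.RangerSparkSQLExtension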
    
    Author: Kent Yao <ya...@apache.org>
    
    Signed-off-by: Liu Xun <li...@apache.org>
    
    Closes #796 from yaooqinn/SUBMARINE-1078 and squashes the following commits:
    
    64ec345 [Kent Yao] submarine
    1657856 [Kent Yao] SUBMARINE-1078. Retire Spark security module from submarine project
---
 .github/workflows/master.yml                       |   37 +-
 .gitignore                                         |    3 -
 pom.xml                                            |    4 -
 submarine-security/spark-security/pom.xml          |  642 -----
 .../CreateRoleCommand.scala                        |   29 -
 .../DropRoleCommand.scala                          |   29 -
 .../ShowCurrentRolesCommand.scala                  |   30 -
 .../ShowRolesCommand.scala                         |   29 -
 .../RangerSparkPlugin.scala                        |   59 -
 .../RangerAdminClientImpl.scala                    |   58 -
 .../CommandUtils.scala                             |   33 -
 .../CreateRoleCommand.scala                        |   59 -
 .../DropRoleCommand.scala                          |   45 -
 .../ShowCurrentRolesCommand.scala                  |   51 -
 .../ShowRolesCommand.scala                         |   50 -
 .../RangerSparkPlugin.scala                        |   59 -
 .../RangerAdminClientImpl.scala                    |   58 -
 .../CommandUtils.scala                             |   33 -
 .../CreateRoleCommand.scala                        |   59 -
 .../DropRoleCommand.scala                          |   45 -
 .../ShowCurrentRolesCommand.scala                  |   51 -
 .../ShowRolesCommand.scala                         |   50 -
 .../RangerSparkPlugin.scala                        |   57 -
 .../RangerAdminClientImpl.scala                    |   77 -
 .../spark/compatible/CompatibleFunc.scala          |   34 -
 .../spark/compatible/SubqueryCompatible.scala      |   34 -
 .../compatible/command/CompatibleCommand.scala     |   33 -
 .../spark/security/parser/SubmarineSqlParser.scala |  107 -
 .../spark/compatible/CompatibleFunc.scala          |   36 -
 .../spark/compatible/SubqueryCompatible.scala      |   29 -
 .../compatible/command/CompatibleCommand.scala     |   36 -
 .../spark/security/parser/SubmarineSqlParser.scala |  116 -
 .../spark/security/parser/SubmarineSqlBase.g4      |  114 -
 .../scala/org/apache/spark/sql/AuthzUtils.scala    |   47 -
 .../SubmarineConfigurationCheckExtension.scala     |   50 -
 .../optimizer/SubmarineDataMaskingExtension.scala  |  268 --
 .../SubmarinePushPredicatesThroughExtensions.scala |   34 -
 .../optimizer/SubmarineRowFilterExtension.scala    |  133 -
 .../optimizer/SubmarineSparkOptimizer.scala        |   39 -
 ...ubmarineSparkRangerAuthorizationExtension.scala |  173 --
 .../plans/logical/SubmarineDataMasking.scala       |   30 -
 .../plans/logical/SubmarineRowFilter.scala         |   31 -
 .../execution/SubmarineShowDatabasesCommand.scala  |   45 -
 .../sql/execution/SubmarineShowTablesCommand.scala |   41 -
 .../execution/SubmarineSparkPlanOmitStrategy.scala |   35 -
 .../execution/command/SubmarineResetCommand.scala  |   38 -
 .../apache/spark/sql/hive/PrivilegesBuilder.scala  |  467 ----
 .../spark/security/RangerSparkAccessRequest.scala  |   91 -
 .../spark/security/RangerSparkAuditHandler.scala   |   28 -
 .../spark/security/RangerSparkAuthorizer.scala     |  303 ---
 .../spark/security/RangerSparkResource.scala       |   93 -
 .../security/SparkAccessControlException.scala     |   27 -
 .../submarine/spark/security/SparkAccessType.scala |   37 -
 .../submarine/spark/security/SparkObjectType.scala |   32 -
 .../spark/security/SparkOperationType.scala        |   63 -
 .../spark/security/SparkPrivObjectActionType.scala |   28 -
 .../spark/security/SparkPrivilegeObject.scala      |  138 -
 .../spark/security/SparkPrivilegeObjectType.scala  |   29 -
 .../security/api/RangerSparkAuthzExtension.scala   |   42 -
 .../security/api/RangerSparkDCLExtension.scala     |   62 -
 .../security/api/RangerSparkSQLExtension.scala     |   49 -
 .../apache/submarine/spark/security/package.scala  |   28 -
 .../security/parser/SubmarineSqlAstBuilder.scala   |   48 -
 .../security/parser/UpperCaseCharStream.scala      |   59 -
 .../src/test/resources/data/files/kv1.txt          |  500 ----
 .../src/test/resources/log4j.properties            |   25 -
 .../src/test/resources/ranger-spark-audit.xml      |   31 -
 .../src/test/resources/ranger-spark-security.xml   |   45 -
 .../src/test/resources/sparkSql_hive_jenkins.json  | 2680 --------------------
 .../spark-security/src/test/resources/tpcds/q1.sql |   34 -
 .../src/test/resources/tpcds/q10.sql               |   72 -
 .../src/test/resources/tpcds/q11.sql               |   83 -
 .../src/test/resources/tpcds/q12.sql               |   37 -
 .../src/test/resources/tpcds/q13.sql               |   64 -
 .../src/test/resources/tpcds/q14a.sql              |  135 -
 .../src/test/resources/tpcds/q14b.sql              |  110 -
 .../src/test/resources/tpcds/q15.sql               |   30 -
 .../src/test/resources/tpcds/q16.sql               |   38 -
 .../src/test/resources/tpcds/q17.sql               |   48 -
 .../src/test/resources/tpcds/q18.sql               |   43 -
 .../src/test/resources/tpcds/q19.sql               |   34 -
 .../spark-security/src/test/resources/tpcds/q2.sql |   96 -
 .../src/test/resources/tpcds/q20.sql               |   33 -
 .../src/test/resources/tpcds/q21.sql               |   40 -
 .../src/test/resources/tpcds/q22.sql               |   29 -
 .../src/test/resources/tpcds/q23a.sql              |   68 -
 .../src/test/resources/tpcds/q23b.sql              |   83 -
 .../src/test/resources/tpcds/q24a.sql              |   49 -
 .../src/test/resources/tpcds/q24b.sql              |   49 -
 .../src/test/resources/tpcds/q25.sql               |   48 -
 .../src/test/resources/tpcds/q26.sql               |   34 -
 .../src/test/resources/tpcds/q27.sql               |   36 -
 .../src/test/resources/tpcds/q28.sql               |   71 -
 .../src/test/resources/tpcds/q29.sql               |   47 -
 .../spark-security/src/test/resources/tpcds/q3.sql |   28 -
 .../src/test/resources/tpcds/q30.sql               |   50 -
 .../src/test/resources/tpcds/q31.sql               |   75 -
 .../src/test/resources/tpcds/q32.sql               |   30 -
 .../src/test/resources/tpcds/q33.sql               |   80 -
 .../src/test/resources/tpcds/q34.sql               |   47 -
 .../src/test/resources/tpcds/q35.sql               |   61 -
 .../src/test/resources/tpcds/q36.sql               |   41 -
 .../src/test/resources/tpcds/q37.sql               |   30 -
 .../src/test/resources/tpcds/q38.sql               |   45 -
 .../src/test/resources/tpcds/q39a.sql              |   62 -
 .../src/test/resources/tpcds/q39b.sql              |   63 -
 .../spark-security/src/test/resources/tpcds/q4.sql |  135 -
 .../src/test/resources/tpcds/q40.sql               |   40 -
 .../src/test/resources/tpcds/q41.sql               |   64 -
 .../src/test/resources/tpcds/q42.sql               |   33 -
 .../src/test/resources/tpcds/q43.sql               |   48 -
 .../src/test/resources/tpcds/q44.sql               |   61 -
 .../src/test/resources/tpcds/q45.sql               |   36 -
 .../src/test/resources/tpcds/q46.sql               |   47 -
 .../src/test/resources/tpcds/q47.sql               |   78 -
 .../src/test/resources/tpcds/q48.sql               |   78 -
 .../src/test/resources/tpcds/q49.sql               |  141 -
 .../spark-security/src/test/resources/tpcds/q5.sql |  146 --
 .../src/test/resources/tpcds/q50.sql               |   62 -
 .../src/test/resources/tpcds/q51.sql               |   70 -
 .../src/test/resources/tpcds/q52.sql               |   29 -
 .../src/test/resources/tpcds/q53.sql               |   45 -
 .../src/test/resources/tpcds/q54.sql               |   76 -
 .../src/test/resources/tpcds/q55.sql               |   28 -
 .../src/test/resources/tpcds/q56.sql               |   80 -
 .../src/test/resources/tpcds/q57.sql               |   71 -
 .../src/test/resources/tpcds/q58.sql               |   74 -
 .../src/test/resources/tpcds/q59.sql               |   90 -
 .../spark-security/src/test/resources/tpcds/q6.sql |   36 -
 .../src/test/resources/tpcds/q60.sql               |   77 -
 .../src/test/resources/tpcds/q61.sql               |   48 -
 .../src/test/resources/tpcds/q62.sql               |   50 -
 .../src/test/resources/tpcds/q63.sql               |   46 -
 .../src/test/resources/tpcds/q64.sql               |  107 -
 .../src/test/resources/tpcds/q65.sql               |   48 -
 .../src/test/resources/tpcds/q66.sql               |  255 --
 .../src/test/resources/tpcds/q67.sql               |   53 -
 .../src/test/resources/tpcds/q68.sql               |   49 -
 .../src/test/resources/tpcds/q69.sql               |   53 -
 .../spark-security/src/test/resources/tpcds/q7.sql |   34 -
 .../src/test/resources/tpcds/q70.sql               |   53 -
 .../src/test/resources/tpcds/q71.sql               |   59 -
 .../src/test/resources/tpcds/q72.sql               |   48 -
 .../src/test/resources/tpcds/q73.sql               |   45 -
 .../src/test/resources/tpcds/q74.sql               |   73 -
 .../src/test/resources/tpcds/q75.sql               |   91 -
 .../src/test/resources/tpcds/q76.sql               |   62 -
 .../src/test/resources/tpcds/q77.sql               |  115 -
 .../src/test/resources/tpcds/q78.sql               |   79 -
 .../src/test/resources/tpcds/q79.sql               |   42 -
 .../spark-security/src/test/resources/tpcds/q8.sql |  102 -
 .../src/test/resources/tpcds/q80.sql               |  109 -
 .../src/test/resources/tpcds/q81.sql               |   53 -
 .../src/test/resources/tpcds/q82.sql               |   30 -
 .../src/test/resources/tpcds/q83.sql               |   71 -
 .../src/test/resources/tpcds/q84.sql               |   34 -
 .../src/test/resources/tpcds/q85.sql               |   97 -
 .../src/test/resources/tpcds/q86.sql               |   39 -
 .../src/test/resources/tpcds/q87.sql               |   43 -
 .../src/test/resources/tpcds/q88.sql               |  137 -
 .../src/test/resources/tpcds/q89.sql               |   45 -
 .../spark-security/src/test/resources/tpcds/q9.sql |   63 -
 .../src/test/resources/tpcds/q90.sql               |   34 -
 .../src/test/resources/tpcds/q91.sql               |   38 -
 .../src/test/resources/tpcds/q92.sql               |   31 -
 .../src/test/resources/tpcds/q93.sql               |   34 -
 .../src/test/resources/tpcds/q94.sql               |   38 -
 .../src/test/resources/tpcds/q95.sql               |   44 -
 .../src/test/resources/tpcds/q96.sql               |   26 -
 .../src/test/resources/tpcds/q97.sql               |   45 -
 .../src/test/resources/tpcds/q98.sql               |   36 -
 .../src/test/resources/tpcds/q99.sql               |   49 -
 .../org/apache/spark/sql/SubmarineSparkUtils.scala |   60 -
 .../SubmarineConfigurationCheckExtensionTest.scala |   55 -
 .../SubmarineDataMaskingExtensionTest.scala        |   50 -
 ...marinePushPredicatesThroughExtensionsTest.scala |   49 -
 .../SubmarineRowFilterExtensionTest.scala          |   51 -
 .../optimizer/SubmarineSparkOptimizerTest.scala    |   50 -
 ...rineSparkRangerAuthorizationExtensionTest.scala |   64 -
 .../spark/security/AuthorizationTest.scala         |  231 --
 .../spark/security/DataMaskingSQLTest.scala        |  295 ---
 .../spark/security/RowFilterSQLTest.scala          |  274 --
 .../submarine/spark/security/TPCDSTest.scala       |  385 ---
 .../security/parser/SubmarineSqlParserTest.scala   |   69 -
 website/docs/devDocs/README.md                     |   12 +-
 .../submarine-security/spark-security/README.md    |  135 -
 .../build-submarine-spark-security-plugin.md       |   32 -
 website/sidebars.js                                |    6 -
 188 files changed, 5 insertions(+), 16144 deletions(-)
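
The complete change set can be inspected locally, e.g.:

    git clone https://gitbox.apache.org/repos/asf/submarine.git
    cd submarine
    git show 50ae48339df4a11d115ac4b76b363da9796a3260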

diff --git a/.github/workflows/master.yml b/.github/workflows/master.yml
index b8ea487..b37d342 100644
--- a/.github/workflows/master.yml
+++ b/.github/workflows/master.yml
@@ -371,40 +371,7 @@ jobs:
       run: |
         echo ">>> mvn $TEST_FLAG $TEST_MODULES $PROFILE -B"
         mvn $TEST_FLAG $TEST_MODULES $PROFILE -B
-  submarine-security:
-    runs-on: ubuntu-latest
-    timeout-minutes: 30
-    strategy:
-      matrix:
-        spark-version: ["2.3", "2.4", "3.0"]
-        range-version: ["1.2", "2.0", "2.1"]
-        exclude:
-        - spark-version: "2.3"
-          range-version: "1.2"
-    steps:
-    - uses: actions/checkout@v2
-      with:
-        fetch-depth: 50
-    - name: Set up JDK 1.8
-      uses: actions/setup-java@v1
-      with:
-        java-version: "1.8"
-    - name: Set up Maven 3.6.3
-      uses: stCarolas/setup-maven@v4
-      with:
-        maven-version: 3.6.3
-    - name: Check version
-      run: |
-        mvn --version
-        java -version
-    - name: Build and Test
-      env:
-        BUILD_FLAG: "clean install -Dmaven.javadoc.skip=true -ntp"
-        MODULES: "-pl :submarine-spark-security"
-        PROFILE: "-Pspark-${{matrix.spark-version}} -Pranger-${{matrix.range-version}}"
-      run: |
-        echo ">>> mvn $BUILD_FLAG $MODULES $PROFILE -B"
-        mvn $BUILD_FLAG $MODULES $PROFILE -B
+
   rat:
     name: Check License
     runs-on: ubuntu-latest
@@ -451,7 +418,6 @@ jobs:
     needs:
       - submarine-e2e
       - submarine-k8s
-      - submarine-security
       - submarine-submitter
       - submarine-server
       - submarine-client
@@ -536,7 +502,6 @@ jobs:
         env:
           SONAR_TOKEN: ${{ secrets.SONARCLOUD_TOKEN }}
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          EXCLUDE_MODULE: '!:submarine-spark-security' # Exclude the modules that can't be compiled with JDK 11
       - name: Delete temporary build artifacts before caching
         run: |
           #Never cache local artifacts
diff --git a/.gitignore b/.gitignore
index 7e1f577..60e35b5 100644
--- a/.gitignore
+++ b/.gitignore
@@ -82,9 +82,6 @@ submarine-cloud/hack/conf/*
 submarine-cloud/hack/output/*
 submarine-cloud/bin/*
 
-submarine-security/spark-security/dependency-reduced-pom.xml
-submarine-security/spark-security/derby.log
-
 # submarine-cloud-v2
 submarine-cloud-v2/vendor/*
 submarine-cloud-v2/submarine-operator
diff --git a/pom.xml b/pom.xml
index 42a2425..e97bf46 100644
--- a/pom.xml
+++ b/pom.xml
@@ -129,9 +129,6 @@
     <zeppelin.version>0.9.0-preview1</zeppelin.version>
     <jgit.version>5.5.1.201910021850-r</jgit.version>
     <atomix.version>3.1.5</atomix.version>
-    <spark.scala.version>2.11.8</spark.scala.version>
-    <spark.scala.binary.version>2.11</spark.scala.binary.version>
-    <hive.version>2.3.6</hive.version>
     <!--  Submarine on Kubernetes  -->
     <k8s.client-java.version>6.0.1</k8s.client-java.version>
     <jersey.test-framework>2.27</jersey.test-framework>
@@ -153,7 +150,6 @@
     <module>submarine-workbench</module>
     <module>submarine-dist</module>
     <module>submarine-test</module>
-    <module>submarine-security/spark-security</module>
   </modules>
 
   <dependencyManagement>
diff --git a/submarine-security/spark-security/pom.xml b/submarine-security/spark-security/pom.xml
deleted file mode 100644
index 539e6dc..0000000
--- a/submarine-security/spark-security/pom.xml
+++ /dev/null
@@ -1,642 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<!--
-  Licensed to the Apache Software Foundation (ASF) under one
-  or more contributor license agreements.  See the NOTICE file
-  distributed with this work for additional information
-  regarding copyright ownership.  The ASF licenses this file
-  to you under the Apache License, Version 2.0 (the
-  "License"); you may not use this file except in compliance
-  with the License.  You may obtain a copy of the License at
-
-    http://www.apache.org/licenses/LICENSE-2.0
-
-  Unless required by applicable law or agreed to in writing,
-  software distributed under the License is distributed on an
-  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-  KIND, either express or implied.  See the License for the
-  specific language governing permissions and limitations
-  under the License.
-  -->
-
-<project xmlns="http://maven.apache.org/POM/4.0.0"
-         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
-         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
-  <parent>
-    <artifactId>submarine</artifactId>
-    <groupId>org.apache.submarine</groupId>
-    <version>0.7.0-SNAPSHOT</version>
-    <relativePath>../../pom.xml</relativePath>
-  </parent>
-  <modelVersion>4.0.0</modelVersion>
-  <packaging>jar</packaging>
-
-  <name>Submarine: Spark Security</name>
-  <artifactId>submarine-spark-security</artifactId>
-
-  <properties>
-    <antlr4.version>4.7</antlr4.version>
-    <eclipse.jpa.version>2.5.2</eclipse.jpa.version>
-    <elasticsearch.version>7.10.2</elasticsearch.version>
-    <gson.version>2.2.4</gson.version>
-    <httpcomponents.httpclient.version>4.5.3</httpcomponents.httpclient.version>
-    <httpcomponents.httpcore.version>4.4.6</httpcomponents.httpcore.version>
-    <httpcomponents.httpmime.version>4.5.3</httpcomponents.httpmime.version>
-    <javax.persistence.version>2.1.0</javax.persistence.version>
-    <jersey-bundle.version>1.19.3</jersey-bundle.version>
-    <noggit.version>0.6</noggit.version>
-    <ranger.spark.package>submarine_spark_ranger_project</ranger.spark.package>
-    <ranger.version>1.2.0</ranger.version>
-    <ranger.major.version>1</ranger.major.version>
-    <spark.compatible.version>2</spark.compatible.version>
-    <scala.version>2.11.8</scala.version>
-    <scala.binary.version>2.11</scala.binary.version>
-    <scalatest.version>2.2.6</scalatest.version>
-    <solr.version>8.4.0</solr.version>
-    <spark.version>2.4.7</spark.version>
-    <spark.scope>provided</spark.scope>
-    <gethostname4j.version>0.0.2</gethostname4j.version>
-    <gethostname4j.scope>test</gethostname4j.scope>
-    <jna.version>5.2.0</jna.version>
-    <jna-platform.version>5.2.0</jna-platform.version>
-    <jna.scope>test</jna.scope>
-    <codehaus.jackson.version>1.9.13</codehaus.jackson.version>
-  </properties>
-  <dependencies>
-    <dependency>
-      <groupId>org.apache.commons</groupId>
-      <artifactId>commons-lang3</artifactId>
-      <scope>test</scope>
-    </dependency>
-    <dependency>
-      <groupId>org.scala-lang</groupId>
-      <artifactId>scala-library</artifactId>
-      <version>${scala.version}</version>
-      <scope>provided</scope>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.spark</groupId>
-      <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
-      <version>${spark.version}</version>
-      <scope>${spark.scope}</scope>
-      <exclusions>
-        <exclusion>
-          <groupId>org.spark-project.spark</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-        <exclusion>
-          <groupId>org.scala-lang.modules</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.spark</groupId>
-      <artifactId>spark-hive_${scala.binary.version}</artifactId>
-      <version>${spark.version}</version>
-      <scope>${spark.scope}</scope>
-      <exclusions>
-        <exclusion>
-          <groupId>org.spark-project.spark</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.ranger</groupId>
-      <artifactId>ranger-plugins-common</artifactId>
-      <version>${ranger.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.ranger</groupId>
-      <artifactId>ranger-plugins-cred</artifactId>
-      <version>${ranger.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.ranger</groupId>
-      <artifactId>ranger-plugins-audit</artifactId>
-      <version>${ranger.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.hive</groupId>
-      <artifactId>hive-exec</artifactId>
-      <version>2.3.4</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.eclipse.persistence</groupId>
-      <artifactId>eclipselink</artifactId>
-      <version>${eclipse.jpa.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.elasticsearch</groupId>
-      <artifactId>elasticsearch</artifactId>
-      <version>${elasticsearch.version}</version>
-    </dependency>
-    <dependency>
-      <groupId>com.google.code.gson</groupId>
-      <artifactId>gson</artifactId>
-      <version>${gson.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.eclipse.persistence</groupId>
-      <artifactId>javax.persistence</artifactId>
-      <version>${javax.persistence.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.httpcomponents</groupId>
-      <artifactId>httpcore</artifactId>
-      <version>${httpcomponents.httpcore.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.httpcomponents</groupId>
-      <artifactId>httpmime</artifactId>
-      <version>${httpcomponents.httpmime.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.httpcomponents</groupId>
-      <artifactId>httpclient</artifactId>
-      <version>${httpcomponents.httpclient.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>com.sun.jersey</groupId>
-      <artifactId>jersey-bundle</artifactId>
-      <version>${jersey-bundle.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.noggit</groupId>
-      <artifactId>noggit</artifactId>
-      <version>${noggit.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-    <dependency>
-      <groupId>org.apache.solr</groupId>
-      <artifactId>solr-solrj</artifactId>
-      <version>${solr.version}</version>
-      <exclusions>
-        <exclusion>
-          <groupId>*</groupId>
-          <artifactId>*</artifactId>
-        </exclusion>
-      </exclusions>
-    </dependency>
-
-    <dependency>
-      <groupId>org.antlr</groupId>
-      <artifactId>antlr4-runtime</artifactId>
-      <version>${antlr4.version}</version>
-    </dependency>
-
-    <!-- unit tests-->
-    <dependency>
-      <groupId>org.scalatest</groupId>
-      <artifactId>scalatest_${scala.binary.version}</artifactId>
-      <version>3.0.3</version>
-    </dependency>
-
-    <dependency>
-      <groupId>org.apache.spark</groupId>
-      <artifactId>spark-core_${scala.binary.version}</artifactId>
-      <version>${spark.version}</version>
-      <type>test-jar</type>
-    </dependency>
-
-    <dependency>
-      <groupId>org.apache.spark</groupId>
-      <artifactId>spark-hive_${scala.binary.version}</artifactId>
-      <version>${spark.version}</version>
-      <type>test-jar</type>
-    </dependency>
-
-    <dependency>
-      <groupId>com.kstruct</groupId>
-      <artifactId>gethostname4j</artifactId>
-      <version>${gethostname4j.version}</version>
-      <scope>${gethostname4j.scope}</scope>
-    </dependency>
-
-    <dependency>
-      <groupId>net.java.dev.jna</groupId>
-      <artifactId>jna</artifactId>
-      <version>${jna.version}</version>
-      <scope>${jna.scope}</scope>
-    </dependency>
-
-    <dependency>
-      <groupId>net.java.dev.jna</groupId>
-      <artifactId>jna-platform</artifactId>
-      <version>${jna-platform.version}</version>
-      <scope>${jna.scope}</scope>
-    </dependency>
-
-    <dependency>
-      <groupId>org.codehaus.jackson</groupId>
-      <artifactId>jackson-jaxrs</artifactId>
-      <version>${codehaus.jackson.version}</version>
-      <scope>compile</scope>
-    </dependency>
-
-  </dependencies>
-
-  <build>
-    <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>
-    <testOutputDirectory>target/scala-${scala.binary.version}/test-classes</testOutputDirectory>
-    <testResources>
-      <testResource>
-        <directory>${project.basedir}/src/test/resources</directory>
-      </testResource>
-    </testResources>
-    <plugins>
-      <plugin>
-        <groupId>org.codehaus.mojo</groupId>
-        <artifactId>build-helper-maven-plugin</artifactId>
-        <executions>
-          <execution>
-            <id>add-source</id>
-            <phase>generate-sources</phase>
-            <goals>
-              <goal>add-source</goal>
-            </goals>
-            <configuration>
-              <sources>
-                <source>ranger-${ranger.major.version}/src/main/scala</source>
-                <source>target/generated-sources/antlr4</source>
-              </sources>
-            </configuration>
-          </execution>
-
-          <execution>
-            <id>add-spark-source</id>
-            <phase>generate-sources</phase>
-            <goals>
-              <goal>add-source</goal>
-            </goals>
-            <configuration>
-              <sources>
-                <source>spark-${spark.compatible.version}/src/main/scala</source>
-              </sources>
-            </configuration>
-          </execution>
-          <execution>
-            <id>add-test-source</id>
-            <phase>generate-test-sources</phase>
-            <goals>
-              <goal>add-test-source</goal>
-            </goals>
-            <configuration>
-              <sources>
-                <source>ranger-${ranger.major.version}/src/test/scala</source>
-              </sources>
-            </configuration>
-          </execution>
-        </executions>
-      </plugin>
-      <plugin>
-        <groupId>net.alchim31.maven</groupId>
-        <artifactId>scala-maven-plugin</artifactId>
-        <version>3.2.2</version>
-        <executions>
-          <execution>
-            <id>eclipse-add-source</id>
-            <goals>
-              <goal>add-source</goal>
-            </goals>
-          </execution>
-          <execution>
-            <id>scala-compile-first</id>
-            <goals>
-              <goal>compile</goal>
-            </goals>
-          </execution>
-          <execution>
-            <id>scala-test-compile-first</id>
-            <goals>
-              <goal>testCompile</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <scalaVersion>${scala.version}</scalaVersion>
-          <recompileMode>incremental</recompileMode>
-          <useZincServer>true</useZincServer>
-          <args>
-            <arg>-unchecked</arg>
-            <arg>-deprecation</arg>
-            <arg>-feature</arg>
-            <arg>-explaintypes</arg>
-            <arg>-Yno-adapted-args</arg>
-          </args>
-          <jvmArgs>
-            <jvmArg>-Xms1024m</jvmArg>
-            <jvmArg>-Xmx1024m</jvmArg>
-            <jvmArg>-XX:ReservedCodeCacheSize=512M</jvmArg>
-          </jvmArgs>
-          <javacArgs>
-            <javacArg>-source</javacArg>
-            <javacArg>${java.version}</javacArg>
-            <javacArg>-target</javacArg>
-            <javacArg>${java.version}</javacArg>
-            <javacArg>-Xlint:all,-serial,-path,-try</javacArg>
-          </javacArgs>
-        </configuration>
-      </plugin>
-
-      <plugin>
-        <groupId>org.apache.maven.plugins</groupId>
-        <artifactId>maven-shade-plugin</artifactId>
-        <version>${plugin.shade.version}</version>
-        <configuration>
-          <shadedArtifactAttached>false</shadedArtifactAttached>
-          <artifactSet>
-            <includes>
-              <include>com.google.code.gson:gson</include>
-              <include>com.sun.jersey:jersey-bundle</include>
-              <include>com.kstruct:gethostname4j</include>
-              <include>net.java.dev.jna:jna</include>
-              <include>net.java.dev.jna:jna-platform</include>
-              <include>org.apache.httpcomponents:httpclient</include>
-              <include>org.apache.httpcomponents:httpcore</include>
-              <include>org.apache.httpcomponents:httpmime</include>
-              <include>org.apache.ranger:ranger-plugins-common</include>
-              <include>org.apache.ranger:ranger-plugins-cred</include>
-              <include>org.apache.ranger:ranger-plugins-audit</include>
-              <include>org.apache.solr:solr-solrj</include>
-              <include>org.codehaus.jackson:jackson-core-asl</include>
-              <include>org.codehaus.jackson:jackson-jaxrs</include>
-              <include>org.codehaus.jackson:jackson-mapper-asl</include>
-              <include>org.codehaus.jackson:jackson-xc</include>
-              <include>org.eclipse.persistence:eclipselink</include>
-              <include>org.eclipse.persistence:javax.persistence</include>
-              <include>org.apache.hive:hive-exec</include>
-              <include>org.noggit:noggit</include>
-            </includes>
-          </artifactSet>
-          <filters>
-            <filter>
-              <artifact>org.apache.hive:hive-exec</artifact>
-              <includes>
-                <!-- Extract masking functions from higher version Apache Hive-->
-                <include>org/apache/hadoop/hive/ql/udf/generic/**Mask**</include>
-                <include>org/apache/hadoop/hive/ql/udf/generic/**Transform**</include>
-              </includes>
-            </filter>
-          </filters>
-          <relocations>
-            <relocation>
-              <pattern>com.sun.jersey</pattern>
-              <shadedPattern>${ranger.spark.package}.com.sun.jersey</shadedPattern>
-            </relocation>
-            <relocation>
-              <pattern>com.sun.research</pattern>
-              <shadedPattern>${ranger.spark.package}.com.sun.research</shadedPattern>
-            </relocation>
-            <relocation>
-              <pattern>com.sun.ws</pattern>
-              <shadedPattern>${ranger.spark.package}.com.sun.ws</shadedPattern>
-            </relocation>
-            <relocation>
-              <pattern>jersey.repackaged</pattern>
-              <shadedPattern>${ranger.spark.package}.jersey.repackaged</shadedPattern>
-            </relocation>
-            <relocation>
-              <pattern>javax.ws.rs</pattern>
-              <shadedPattern>${ranger.spark.package}.javax.ws.rs</shadedPattern>
-            </relocation>
-            <relocation>
-              <pattern>org.codehaus</pattern>
-              <shadedPattern>${ranger.spark.package}.org.codehaus</shadedPattern>
-            </relocation>
-          </relocations>
-        </configuration>
-        <executions>
-          <execution>
-            <phase>package</phase>
-            <goals>
-              <goal>shade</goal>
-            </goals>
-            <configuration>
-              <transformers>
-                <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
-              </transformers>
-            </configuration>
-          </execution>
-        </executions>
-      </plugin>
-
-      <plugin>
-        <groupId>org.apache.maven.plugins</groupId>
-        <artifactId>maven-enforcer-plugin</artifactId>
-        <configuration>
-          <skip>true</skip>
-        </configuration>
-      </plugin>
-
-      <!-- disable surefire -->
-      <plugin>
-        <groupId>org.apache.maven.plugins</groupId>
-        <artifactId>maven-surefire-plugin</artifactId>
-        <configuration>
-          <skipTests>true</skipTests>
-        </configuration>
-      </plugin>
-      <!-- enable scalatest -->
-      <plugin>
-        <groupId>org.scalatest</groupId>
-        <artifactId>scalatest-maven-plugin</artifactId>
-        <version>1.0</version>
-        <configuration>
-          <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
-          <junitxml>.</junitxml>
-          <filereports>TestSuite.txt</filereports>
-        </configuration>
-        <executions>
-          <execution>
-            <id>test</id>
-            <goals>
-              <goal>test</goal>
-            </goals>
-          </execution>
-        </executions>
-      </plugin>
-
-      <plugin>
-        <groupId>org.antlr</groupId>
-        <artifactId>antlr4-maven-plugin</artifactId>
-        <version>${antlr4.version}</version>
-        <executions>
-          <execution>
-            <goals>
-              <goal>antlr4</goal>
-            </goals>
-          </execution>
-        </executions>
-        <configuration>
-          <visitor>true</visitor>
-          <sourceDirectory>./src/main/antlr4</sourceDirectory>
-          <treatWarningsAsErrors>true</treatWarningsAsErrors>
-        </configuration>
-      </plugin>
-
-      <plugin>
-        <groupId>org.jacoco</groupId>
-        <artifactId>jacoco-maven-plugin</artifactId>
-        <version>0.8.0</version>
-        <configuration>
-        </configuration>
-        <executions>
-          <execution>
-            <id>pre-test</id>
-            <goals>
-              <goal>prepare-agent</goal>
-            </goals>
-          </execution>
-          <execution>
-            <id>report</id>
-            <phase>test</phase>
-            <goals>
-              <goal>report</goal>
-            </goals>
-          </execution>
-        </executions>
-      </plugin>
-    </plugins>
-  </build>
-
-  <profiles>
-    <profile>
-      <id>spark-2.4</id>
-      <properties>
-        <spark.version>2.4.7</spark.version>
-        <scalatest.version>3.0.3</scalatest.version>
-      </properties>
-    </profile>
-
-    <profile>
-      <id>spark-3.0</id>
-      <properties>
-        <spark.version>3.0.2</spark.version>
-        <scala.version>2.12.10</scala.version>
-        <scala.binary.version>2.12</scala.binary.version>
-        <!--<scalatest.version>3.2.0</scalatest.version>-->
-        <spark.compatible.version>3</spark.compatible.version>
-        <commons-lang3.version>3.9</commons-lang3.version>
-        <jackson-databind.version>2.10.5</jackson-databind.version>
-        <jackson-annotations.version>2.10.5</jackson-annotations.version>
-      </properties>
-    </profile>
-
-    <profile>
-      <id>ranger-1.2</id>
-      <properties>
-        <eclipse.jpa.version>2.5.2</eclipse.jpa.version>
-        <gson.version>2.2.4</gson.version>
-        <httpcomponents.httpclient.version>4.5.3</httpcomponents.httpclient.version>
-        <httpcomponents.httpcore.version>4.4.1</httpcomponents.httpcore.version>
-        <httpcomponents.httpmime.version>4.5.3</httpcomponents.httpmime.version>
-        <javax.persistence.version>2.1.0</javax.persistence.version>
-        <jersey-bundle.version>1.19.3</jersey-bundle.version>
-        <noggit.version>0.6</noggit.version>
-        <ranger.version>1.2.0</ranger.version>
-        <solr.version>5.5.4</solr.version>
-      </properties>
-    </profile>
-
-    <profile>
-      <id>ranger-2.0</id>
-      <properties>
-        <httpcomponents.httpclient.version>4.5.3</httpcomponents.httpclient.version>
-        <httpcomponents.httpcore.version>4.4.6</httpcomponents.httpcore.version>
-        <httpcomponents.httpmime.version>4.5.3</httpcomponents.httpmime.version>
-        <ranger.version>2.0.0</ranger.version>
-        <ranger.major.version>2.0</ranger.major.version>
-        <solr.version>7.7.1</solr.version>
-        <gethostname4j.scope>compile</gethostname4j.scope>
-        <jna.scope>compile</jna.scope>
-      </properties>
-    </profile>
-    <profile>
-      <id>ranger-2.1</id>
-      <properties>
-        <httpcomponents.httpclient.version>4.5.3</httpcomponents.httpclient.version>
-        <httpcomponents.httpcore.version>4.4.6</httpcomponents.httpcore.version>
-        <httpcomponents.httpmime.version>4.5.3</httpcomponents.httpmime.version>
-        <ranger.version>2.1.0</ranger.version>
-        <ranger.major.version>2.1</ranger.major.version>
-        <solr.version>7.7.1</solr.version>
-        <elasticsearch.version>7.10.2</elasticsearch.version>
-        <gethostname4j.scope>compile</gethostname4j.scope>
-        <jna.scope>compile</jna.scope>
-      </properties>
-    </profile>
-
-  </profiles>
-</project>
diff --git a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala b/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
deleted file mode 100644
index 3efa931..0000000
--- a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
+++ /dev/null
@@ -1,29 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-case class CreateRoleCommand(roleName: String) extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    throw new UnsupportedOperationException("CREATE ROLE")
-  }
-}
diff --git a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala b/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
deleted file mode 100644
index e8fc57c..0000000
--- a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
+++ /dev/null
@@ -1,29 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-case class DropRoleCommand (roleName: String) extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    throw new UnsupportedOperationException("DROP ROLE")
-  }
-}
diff --git a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala b/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
deleted file mode 100644
index b736af5..0000000
--- a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-case class ShowCurrentRolesCommand() extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    throw new UnsupportedOperationException("SHOW CURRENT ROLES")
-  }
-}
-
diff --git a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala b/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
deleted file mode 100644
index ab4fc10..0000000
--- a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
+++ /dev/null
@@ -1,29 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-case class ShowRolesCommand () extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    throw new UnsupportedOperationException("SHOW ROLES")
-  }
-}
diff --git a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala b/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
deleted file mode 100644
index 3e34ab7..0000000
--- a/submarine-security/spark-security/ranger-1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.io.{File, IOException}
-
-import org.apache.commons.logging.LogFactory
-import org.apache.ranger.authorization.hadoop.config.RangerConfiguration
-import org.apache.ranger.plugin.service.RangerBasePlugin
-
-object RangerSparkPlugin extends RangerBasePlugin("spark", "sparkSql") {
-
-  private val LOG = LogFactory.getLog(RangerSparkPlugin.getClass)
-
-  private val rangerConf: RangerConfiguration = RangerConfiguration.getInstance
-  val showColumnsOption: String = rangerConf.get(
-    "xasecure.spark.describetable.showcolumns.authorization.option", "NONE")
-
-  lazy val fsScheme: Array[String] = RangerConfiguration.getInstance()
-    .get("ranger.plugin.spark.urlauth.filesystem.schemes", "hdfs:,file:")
-    .split(",")
-    .map(_.trim)
-
-  override def init(): Unit = {
-    super.init()
-    val cacheDir = new File(rangerConf.get("ranger.plugin.spark.policy.cache.dir"))
-    if (cacheDir.exists() &&
-      (!cacheDir.isDirectory || !cacheDir.canRead || !cacheDir.canWrite)) {
-      throw new IOException("Policy cache directory already exists at" +
-        cacheDir.getAbsolutePath + ", but it is unavailable")
-    }
-
-    if (!cacheDir.exists() && !cacheDir.mkdirs()) {
-      throw new IOException("Unable to create ranger policy cache directory at" +
-        cacheDir.getAbsolutePath)
-    }
-    LOG.info("Policy cache directory successfully set to " + cacheDir.getAbsolutePath)
-  }
-
-  init()
-}
-
diff --git a/submarine-security/spark-security/ranger-1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala b/submarine-security/spark-security/ranger-1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
deleted file mode 100644
index 0723e84..0000000
--- a/submarine-security/spark-security/ranger-1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
+++ /dev/null
@@ -1,58 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.nio.file.{Files, FileSystems}
-import java.util
-
-import com.google.gson.GsonBuilder
-import org.apache.commons.logging.{Log, LogFactory}
-import org.apache.ranger.admin.client.RangerAdminRESTClient
-import org.apache.ranger.plugin.util.{GrantRevokeRequest, ServicePolicies, ServiceTags}
-
-class RangerAdminClientImpl extends RangerAdminRESTClient {
-  private val LOG: Log = LogFactory.getLog(classOf[RangerAdminClientImpl])
-  private val cacheFilename = "sparkSql_hive_jenkins.json"
-  private val gson =
-    new GsonBuilder().setDateFormat("yyyyMMdd-HH:mm:ss.SSS-Z").setPrettyPrinting().create
-  private var policies: ServicePolicies = _
-
-  override def init(serviceName: String, appId: String, configPropertyPrefix: String): Unit = {
-    if (policies == null) {
-      val basedir = this.getClass.getProtectionDomain.getCodeSource.getLocation.getPath
-      val cachePath = FileSystems.getDefault.getPath(basedir, cacheFilename)
-      LOG.info("Reading policies from " + cachePath)
-      val bytes = Files.readAllBytes(cachePath)
-      policies = gson.fromJson(new String(bytes), classOf[ServicePolicies])
-    }
-  }
-
-  override def getServicePoliciesIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServicePolicies = {
-    policies
-  }
-
-  override def grantAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def revokeAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def getServiceTagsIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServiceTags = null
-
-  override def getTagTypes(tagTypePattern: String): util.List[String] = null
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala
deleted file mode 100644
index 3657553..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala
+++ /dev/null
@@ -1,33 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-private[command] object CommandUtils {
-
-  final val RESERVED_ROLE_NAMES = Set("ALL", "DEFAULT", "NONE")
-
-  def validateRoleName(roleName: String): Unit = {
-    if (RESERVED_ROLE_NAMES.exists(roleName.equalsIgnoreCase)) {
-      throw new IllegalArgumentException(s"Role name cannot be one of the reserved roles: " +
-        s"${RESERVED_ROLE_NAMES.mkString(",")}")
-    }
-  }
-
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
deleted file mode 100644
index ae394b7..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import java.util.Arrays
-
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.ranger.plugin.model.RangerRole
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-
-case class CreateRoleCommand(roleName: String) extends RunnableCommand {
-  import CommandUtils._
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-
-    validateRoleName(roleName)
-    val auditHandler = RangerSparkAuditHandler()
-    val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-
-    val role = new RangerRole()
-    role.setName(roleName)
-    role.setCreatedByUser(currentUser)
-    role.setCreatedBy(currentUser)
-    role.setUpdatedBy(currentUser)
-    val member = new RangerRole.RoleMember(currentUser, true)
-    role.setUsers(Arrays.asList(member))
-    try {
-      val res = RangerSparkPlugin.createRole(role, auditHandler)
-      logDebug(s"Create role: ${res.getName} success")
-      Seq.empty[Row]
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
deleted file mode 100644
index 230bdb5..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
+++ /dev/null
@@ -1,45 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class DropRoleCommand(roleName: String) extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    CommandUtils.validateRoleName(roleName)
-
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      RangerSparkPlugin.dropRole(currentUser, roleName, auditHandler)
-      Seq.empty[Row]
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
deleted file mode 100644
index 7739483..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.collection.JavaConverters._
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
-import org.apache.spark.sql.execution.command.RunnableCommand
-import org.apache.spark.sql.types.StringType
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class ShowCurrentRolesCommand() extends RunnableCommand {
-
-  override def output: Seq[Attribute] =
-    Seq(AttributeReference("Role Name", StringType, nullable = false)())
-
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      val roles = RangerSparkPlugin.getUserRoles(currentUser, auditHandler)
-      roles.asScala.map(Row(_))
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
deleted file mode 100644
index 73d0ab1..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.collection.JavaConverters._
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
-import org.apache.spark.sql.execution.command.RunnableCommand
-import org.apache.spark.sql.types.StringType
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class ShowRolesCommand() extends RunnableCommand {
-
-  override def output: Seq[Attribute] =
-    Seq(AttributeReference("Role Name", StringType, nullable = false)())
-
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      val roles = RangerSparkPlugin.getAllRoles(currentUser, auditHandler)
-      roles.asScala.map(Row(_))
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala b/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
deleted file mode 100644
index 3e34ab7..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.io.{File, IOException}
-
-import org.apache.commons.logging.LogFactory
-import org.apache.ranger.authorization.hadoop.config.RangerConfiguration
-import org.apache.ranger.plugin.service.RangerBasePlugin
-
-object RangerSparkPlugin extends RangerBasePlugin("spark", "sparkSql") {
-
-  private val LOG = LogFactory.getLog(RangerSparkPlugin.getClass)
-
-  private val rangerConf: RangerConfiguration = RangerConfiguration.getInstance
-  val showColumnsOption: String = rangerConf.get(
-    "xasecure.spark.describetable.showcolumns.authorization.option", "NONE")
-
-  lazy val fsScheme: Array[String] = RangerConfiguration.getInstance()
-    .get("ranger.plugin.spark.urlauth.filesystem.schemes", "hdfs:,file:")
-    .split(",")
-    .map(_.trim)
-
-  override def init(): Unit = {
-    super.init()
-    val cacheDir = new File(rangerConf.get("ranger.plugin.spark.policy.cache.dir"))
-    if (cacheDir.exists() &&
-      (!cacheDir.isDirectory || !cacheDir.canRead || !cacheDir.canWrite)) {
-      throw new IOException("Policy cache directory already exists at" +
-        cacheDir.getAbsolutePath + ", but it is unavailable")
-    }
-
-    if (!cacheDir.exists() && !cacheDir.mkdirs()) {
-      throw new IOException("Unable to create ranger policy cache directory at" +
-        cacheDir.getAbsolutePath)
-    }
-    LOG.info("Policy cache directory successfully set to " + cacheDir.getAbsolutePath)
-  }
-
-  init()
-}
-
diff --git a/submarine-security/spark-security/ranger-2.0/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala b/submarine-security/spark-security/ranger-2.0/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
deleted file mode 100644
index 0723e84..0000000
--- a/submarine-security/spark-security/ranger-2.0/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
+++ /dev/null
@@ -1,58 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.nio.file.{Files, FileSystems}
-import java.util
-
-import com.google.gson.GsonBuilder
-import org.apache.commons.logging.{Log, LogFactory}
-import org.apache.ranger.admin.client.RangerAdminRESTClient
-import org.apache.ranger.plugin.util.{GrantRevokeRequest, ServicePolicies, ServiceTags}
-
-class RangerAdminClientImpl extends RangerAdminRESTClient {
-  private val LOG: Log = LogFactory.getLog(classOf[RangerAdminClientImpl])
-  private val cacheFilename = "sparkSql_hive_jenkins.json"
-  private val gson =
-    new GsonBuilder().setDateFormat("yyyyMMdd-HH:mm:ss.SSS-Z").setPrettyPrinting().create
-  private var policies: ServicePolicies = _
-
-  override def init(serviceName: String, appId: String, configPropertyPrefix: String): Unit = {
-    if (policies == null) {
-      val basedir = this.getClass.getProtectionDomain.getCodeSource.getLocation.getPath
-      val cachePath = FileSystems.getDefault.getPath(basedir, cacheFilename)
-      LOG.info("Reading policies from " + cachePath)
-      val bytes = Files.readAllBytes(cachePath)
-      policies = gson.fromJson(new String(bytes), classOf[ServicePolicies])
-    }
-  }
-
-  override def getServicePoliciesIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServicePolicies = {
-    policies
-  }
-
-  override def grantAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def revokeAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def getServiceTagsIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServiceTags = null
-
-  override def getTagTypes(tagTypePattern: String): util.List[String] = null
-}
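
This offline client is what let the ranger-2.0 tests run without a live Ranger Admin server: policies are deserialized once from sparkSql_hive_jenkins.json on the test classpath, and every write-side call is a no-op. Swapping it in relies on Ranger's standard policy-source override; the snippet below shows the idea in Scala, though in a real setup the property would live in the plugin's XML configuration. The property name follows Ranger's usual ranger.plugin.<service>.* convention; treat the exact wiring as an assumption rather than a quotation from the module's test resources:

    import org.apache.hadoop.conf.Configuration

    val conf = new Configuration()
    // Tell RangerBasePlugin to instantiate the offline client instead of
    // the default RangerAdminRESTClient.
    conf.set("ranger.plugin.spark.policy.source.impl",
      "org.apache.submarine.spark.security.RangerAdminClientImpl")
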
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala
deleted file mode 100644
index 3657553..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CommandUtils.scala
+++ /dev/null
@@ -1,33 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-private[command] object CommandUtils {
-
-  final val RESERVED_ROLE_NAMES = Set("ALL", "DEFAULT", "NONE")
-
-  def validateRoleName(roleName: String): Unit = {
-    if (RESERVED_ROLE_NAMES.exists(roleName.equalsIgnoreCase)) {
-      throw new IllegalArgumentException(s"Role name cannot be one of the reserved roles: " +
-        s"${RESERVED_ROLE_NAMES.mkString(",")}")
-    }
-  }
-
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
deleted file mode 100644
index ae394b7..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/CreateRoleCommand.scala
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import java.util.Arrays
-
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.ranger.plugin.model.RangerRole
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-
-case class CreateRoleCommand(roleName: String) extends RunnableCommand {
-  import CommandUtils._
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-
-    validateRoleName(roleName)
-    val auditHandler = RangerSparkAuditHandler()
-    val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-
-    val role = new RangerRole()
-    role.setName(roleName)
-    role.setCreatedByUser(currentUser)
-    role.setCreatedBy(currentUser)
-    role.setUpdatedBy(currentUser)
-    val member = new RangerRole.RoleMember(currentUser, true)
-    role.setUsers(Arrays.asList(member))
-    try {
-      val res = RangerSparkPlugin.createRole(role, auditHandler)
-      logDebug(s"Create role: ${res.getName} success")
-      Seq.empty[Row]
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
deleted file mode 100644
index 230bdb5..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/DropRoleCommand.scala
+++ /dev/null
@@ -1,45 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class DropRoleCommand(roleName: String) extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    CommandUtils.validateRoleName(roleName)
-
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      RangerSparkPlugin.dropRole(currentUser, roleName, auditHandler)
-      Seq.empty[Row]
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
deleted file mode 100644
index 7739483..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowCurrentRolesCommand.scala
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.collection.JavaConverters._
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
-import org.apache.spark.sql.execution.command.RunnableCommand
-import org.apache.spark.sql.types.StringType
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class ShowCurrentRolesCommand() extends RunnableCommand {
-
-  override def output: Seq[Attribute] =
-    Seq(AttributeReference("Role Name", StringType, nullable = false)())
-
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      val roles = RangerSparkPlugin.getUserRoles(currentUser, auditHandler)
-      roles.asScala.map(Row(_))
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
deleted file mode 100644
index 73d0ab1..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security.command/ShowRolesCommand.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.command
-
-import scala.collection.JavaConverters._
-import scala.util.control.NonFatal
-
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.catalyst.expressions.{Attribute, AttributeReference}
-import org.apache.spark.sql.execution.command.RunnableCommand
-import org.apache.spark.sql.types.StringType
-
-import org.apache.submarine.spark.security.{RangerSparkAuditHandler, RangerSparkPlugin, SparkAccessControlException}
-
-case class ShowRolesCommand() extends RunnableCommand {
-
-  override def output: Seq[Attribute] =
-    Seq(AttributeReference("Role Name", StringType, nullable = false)())
-
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    try {
-      val auditHandler = RangerSparkAuditHandler()
-      val currentUser = UserGroupInformation.getCurrentUser.getShortUserName
-      val roles = RangerSparkPlugin.getAllRoles(currentUser, auditHandler)
-      roles.asScala.map(Row(_))
-    } catch {
-      case NonFatal(e) => throw new SparkAccessControlException(e.getMessage, e)
-    } finally {
-      // TODO: support auditHandler.flushAudit()
-    }
-  }
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala b/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
deleted file mode 100644
index e16007c..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/main/scala/org.apache.submarine.spark.security/RangerSparkPlugin.scala
+++ /dev/null
@@ -1,57 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.io.{File, IOException}
-import org.apache.commons.logging.LogFactory
-import org.apache.ranger.authorization.hadoop.config.RangerPluginConfig
-import org.apache.ranger.plugin.service.RangerBasePlugin
-
-object RangerSparkPlugin extends RangerBasePlugin("spark", "sparkSql") {
-
-  private val LOG = LogFactory.getLog(RangerSparkPlugin.getClass)
-
-  private val rangerConf: RangerPluginConfig = this.getConfig
-  val showColumnsOption: String = rangerConf.get(
-    "xasecure.spark.describetable.showcolumns.authorization.option", "NONE")
-
-  lazy val fsScheme: Array[String] = rangerConf
-    .get("ranger.plugin.spark.urlauth.filesystem.schemes", "hdfs:,file:")
-    .split(",")
-    .map(_.trim)
-
-  override def init(): Unit = {
-    super.init()
-    val cacheDir = new File(rangerConf.get("ranger.plugin.spark.policy.cache.dir"))
-    if (cacheDir.exists() &&
-      (!cacheDir.isDirectory || !cacheDir.canRead || !cacheDir.canWrite)) {
-      throw new IOException("Policy cache directory already exists at" +
-        cacheDir.getAbsolutePath + ", but it is unavailable")
-    }
-
-    if (!cacheDir.exists() && !cacheDir.mkdirs()) {
-      throw new IOException("Unable to create ranger policy cache directory at" +
-        cacheDir.getAbsolutePath)
-    }
-    LOG.info("Policy cache directory successfully set to " + cacheDir.getAbsolutePath)
-  }
-
-  init()
-}
diff --git a/submarine-security/spark-security/ranger-2.1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala b/submarine-security/spark-security/ranger-2.1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
deleted file mode 100644
index ee442f0..0000000
--- a/submarine-security/spark-security/ranger-2.1/src/test/scala/org.apache.submarine.spark.security/RangerAdminClientImpl.scala
+++ /dev/null
@@ -1,77 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.nio.file.{FileSystems, Files}
-import java.util
-import com.google.gson.GsonBuilder
-import org.apache.commons.logging.{Log, LogFactory}
-import org.apache.hadoop.conf.Configuration
-import org.apache.ranger.admin.client.RangerAdminClient
-import org.apache.ranger.plugin.model.RangerRole
-import org.apache.ranger.plugin.util.{GrantRevokeRequest, GrantRevokeRoleRequest, RangerRoles, RangerUserStore, ServicePolicies, ServiceTags}
-
-class RangerAdminClientImpl extends RangerAdminClient {
-  private val LOG: Log = LogFactory.getLog(classOf[RangerAdminClientImpl])
-  private val cacheFilename = "sparkSql_hive_jenkins.json"
-  private val gson =
-    new GsonBuilder().setDateFormat("yyyyMMdd-HH:mm:ss.SSS-Z").setPrettyPrinting().create
-  private var policies: ServicePolicies = _
-
-  override def init(serviceName: String, appId: String, configPropertyPrefix: String, config: Configuration): Unit = {
-    if (policies == null) {
-      val basedir = this.getClass.getProtectionDomain.getCodeSource.getLocation.getPath
-      val cachePath = FileSystems.getDefault.getPath(basedir, cacheFilename)
-      LOG.info("Reading policies from " + cachePath)
-      val bytes = Files.readAllBytes(cachePath)
-      policies = gson.fromJson(new String(bytes), classOf[ServicePolicies])
-    }
-  }
-
-  override def getServicePoliciesIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServicePolicies = {
-    policies
-  }
-
-  override def grantAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def revokeAccess(request: GrantRevokeRequest): Unit = {}
-
-  override def getServiceTagsIfUpdated(lastKnownVersion: Long, lastActivationTimeInMillis: Long): ServiceTags = null
-
-  override def getTagTypes(tagTypePattern: String): util.List[String] = null
-
-  override def getRolesIfUpdated(lastKnownRoleVersion: Long, lastActivationTimeInMillis: Long): RangerRoles = null
-
-  override def createRole(rangerRole: RangerRole): RangerRole = null
-
-  override def dropRole(execUser: String, roleName: String): Unit = {}
-
-  override def getAllRoles(execUser: String): util.List[String] = null
-
-  override def getUserRoles(user: String): util.List[String] = null
-
-  override def getRole(execUser: String, roleName: String): RangerRole = null
-
-  override def grantRole(grantRevokeRoleRequest: GrantRevokeRoleRequest): Unit = {}
-
-  override def revokeRole(grantRevokeRoleRequest: GrantRevokeRoleRequest): Unit = {}
-
-  override def getUserStoreIfUpdated(lastKnownUserStoreVersion: Long, lastActivationTimeInMillis: Long): RangerUserStore = null
-}
diff --git a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala b/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala
deleted file mode 100644
index 4b2e6dd..0000000
--- a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
-import org.apache.spark.sql.execution.command.{AnalyzeColumnCommand, SetDatabaseCommand, ShowDatabasesCommand}
-
-object CompatibleFunc {
-
-  def getPattern(child: ShowDatabasesCommand) = child.databasePattern
-
-  def getCatLogName(s: SetDatabaseCommand) = s.databaseName
-
-  def analyzeColumnName(column: AnalyzeColumnCommand) = column.columnNames
-
-  def tableIdentifier(u: UnresolvedRelation) = u.tableIdentifier
-}
diff --git a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala b/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala
deleted file mode 100644
index 7c0847f..0000000
--- a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Subquery}
-
-case class SubqueryCompatible(child: LogicalPlan, correlated: Boolean = false) {
-  Subquery(child)
-}
-
-object SubqueryCompatible {
-  // def apply(child: LogicalPlan, correlated: Boolean= false) = Subquery(child)
-  def unapply(subquery: Subquery): Option[LogicalPlan] = Subquery.unapply(subquery)
-}
-
-
-
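
The spark-2 and spark-3 source trees each ship a SubqueryCompatible matching the constructor arity of their Spark line (Spark 2's Subquery(child) versus Spark 3's Subquery(child, correlated)), and the build compiles exactly one of the two trees. A self-contained sketch of that shim pattern, using invented stand-in types rather than real Spark classes:

    // Stand-ins for two incompatible versions of the same upstream API.
    object SparkV2Api { case class Subquery(child: String) }
    object SparkV3Api { case class Subquery(child: String, correlated: Boolean) }

    // One shim per source tree exposes a single stable signature to shared code;
    // the build selects whichever tree matches the Spark version on the classpath.
    object SubqueryShimV2 {
      def apply(child: String, correlated: Boolean = false): SparkV2Api.Subquery =
        SparkV2Api.Subquery(child) // v2 has no `correlated` flag to forward
    }
    object SubqueryShimV3 {
      def apply(child: String, correlated: Boolean = false): SparkV3Api.Subquery =
        SparkV3Api.Subquery(child, correlated)
    }
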
diff --git a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala b/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala
deleted file mode 100644
index 84325c3..0000000
--- a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala
+++ /dev/null
@@ -1,33 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.execution.command.{PersistedView, SetDatabaseCommand, ShowDatabasesCommand}
-
-package object CompatibleCommand {
-
-  type ShowDatabasesCommandCompatible = ShowDatabasesCommand
-  type SetDatabaseCommandCompatible = SetDatabaseCommand
-
-}
-
-object PersistedViewCompatible {
-  val obj: PersistedView.type = PersistedView
-}
diff --git a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala b/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala
deleted file mode 100644
index dc8c43e..0000000
--- a/submarine-security/spark-security/spark-2/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala
+++ /dev/null
@@ -1,107 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.parser
-
-import org.antlr.v4.runtime.{CharStreams, CommonTokenStream}
-import org.antlr.v4.runtime.atn.PredictionMode
-import org.antlr.v4.runtime.misc.ParseCancellationException
-import org.apache.spark.sql.AnalysisException
-import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
-import org.apache.spark.sql.catalyst.expressions.Expression
-import org.apache.spark.sql.catalyst.parser.{ParseErrorListener, ParseException, ParserInterface, PostProcessor}
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.catalyst.trees.Origin
-import org.apache.spark.sql.types.{DataType, StructType}
-
-class SubmarineSqlParser(val delegate: ParserInterface) extends ParserInterface {
-
-  private val astBuilder = new SubmarineSqlAstBuilder
-
-  override def parsePlan(sqlText: String): LogicalPlan = parse(sqlText) { parser =>
-    astBuilder.visit(parser.singleStatement()) match {
-      case plan: LogicalPlan => plan
-      case _ => delegate.parsePlan(sqlText)
-    }
-  }
-
-  // scalastyle:off line.size.limit
-  /**
-    * Fork from `org.apache.spark.sql.catalyst.parser.AbstractSqlParser#parse(java.lang.String, scala.Function1)`.
-    *
-    * @see https://github.com/apache/spark/blob/v2.4.4/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala#L81
-    */
-  // scalastyle:on
-  private def parse[T](command: String)(toResult: SubmarineSqlBaseParser => T): T = {
-    val lexer = new SubmarineSqlBaseLexer(new UpperCaseCharStream(CharStreams.fromString(command)))
-    lexer.removeErrorListeners()
-    lexer.addErrorListener(ParseErrorListener)
-
-    val tokenStream = new CommonTokenStream(lexer)
-    val parser = new SubmarineSqlBaseParser(tokenStream)
-    parser.addParseListener(PostProcessor)
-    parser.removeErrorListeners()
-    parser.addErrorListener(ParseErrorListener)
-
-    try {
-      try {
-        // first, try parsing with potentially faster SLL mode
-        parser.getInterpreter.setPredictionMode(PredictionMode.SLL)
-        toResult(parser)
-      } catch {
-        case e: ParseCancellationException =>
-          // if we fail, parse with LL mode
-          tokenStream.seek(0) // rewind input stream
-          parser.reset()
-
-          // Try Again.
-          parser.getInterpreter.setPredictionMode(PredictionMode.LL)
-          toResult(parser)
-      }
-    } catch {
-      case e: ParseException if e.command.isDefined =>
-        throw e
-      case e: ParseException =>
-        throw e.withCommand(command)
-      case e: AnalysisException =>
-        val position = Origin(e.line, e.startPosition)
-        throw new ParseException(Option(command), e.message, position, position)
-    }
-  }
-
-  override def parseExpression(sqlText: String): Expression = {
-    delegate.parseExpression(sqlText)
-  }
-
-  override def parseTableIdentifier(sqlText: String): TableIdentifier = {
-    delegate.parseTableIdentifier(sqlText)
-  }
-
-  override def parseFunctionIdentifier(sqlText: String): FunctionIdentifier = {
-    delegate.parseFunctionIdentifier(sqlText)
-  }
-
-  override def parseTableSchema(sqlText: String): StructType = {
-    delegate.parseTableSchema(sqlText)
-  }
-
-  override def parseDataType(sqlText: String): DataType = {
-    delegate.parseDataType(sqlText)
-  }
-}
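
The private parse helper above follows ANTLR's well-known two-stage strategy: attempt the fast SLL prediction mode first, and retry with full LL prediction only if SLL gives up. Stripped of the Spark-specific error handling, the pattern reduces to the sketch below (MyLexer and MyParser are assumed ANTLR-generated class names, not classes from this module; substitute your own generated lexer and parser):

    import org.antlr.v4.runtime.{CharStreams, CommonTokenStream}
    import org.antlr.v4.runtime.atn.PredictionMode
    import org.antlr.v4.runtime.misc.ParseCancellationException

    object TwoStageParse {
      def parse[T](command: String)(toResult: MyParser => T): T = {
        val lexer = new MyLexer(CharStreams.fromString(command))
        val tokenStream = new CommonTokenStream(lexer)
        val parser = new MyParser(tokenStream)
        try {
          // Stage 1: SLL prediction is fast and succeeds for almost all input.
          parser.getInterpreter.setPredictionMode(PredictionMode.SLL)
          toResult(parser)
        } catch {
          case _: ParseCancellationException =>
            // Stage 2: rewind the token stream and reparse with full LL prediction.
            tokenStream.seek(0)
            parser.reset()
            parser.getInterpreter.setPredictionMode(PredictionMode.LL)
            toResult(parser)
        }
      }
    }
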
diff --git a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala b/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala
deleted file mode 100644
index 5c50ced..0000000
--- a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/CompatibleFunc.scala
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.catalyst.TableIdentifier
-import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
-import org.apache.spark.sql.catalyst.plans.logical.{SetCatalogAndNamespace, ShowNamespaces}
-import org.apache.spark.sql.execution.command.AnalyzeColumnCommand
-
-object CompatibleFunc {
-
-  def getPattern(child: ShowNamespaces) = child.pattern
-
-  def getCatLogName(s: SetCatalogAndNamespace) = s.catalogName
-
-  def analyzeColumnName(column: AnalyzeColumnCommand) = column.columnNames.get
-
-  def tableIdentifier(u: UnresolvedRelation) = TableIdentifier.apply(u.tableName)
-}
diff --git a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala b/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala
deleted file mode 100644
index 63b9695..0000000
--- a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/SubqueryCompatible.scala
+++ /dev/null
@@ -1,29 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, Subquery}
-
-object SubqueryCompatible {
-  def apply(child: LogicalPlan, correlated: Boolean) = Subquery(child, correlated)
-  def unapply(subquery: Subquery): Option[(LogicalPlan, Boolean)] = Subquery.unapply(subquery)
-}
-
-
diff --git a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala b/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala
deleted file mode 100644
index 8bd3311..0000000
--- a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/compatible/command/CompatibleCommand.scala
+++ /dev/null
@@ -1,36 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.compatible
-
-import org.apache.spark.sql.catalyst.analysis.PersistedView
-import org.apache.spark.sql.catalyst.plans.logical.{SetCatalogAndNamespace, ShowNamespaces}
-
-
-package object CompatibleCommand {
-
-  type ShowDatabasesCommandCompatible = ShowNamespaces
-  type SetDatabaseCommandCompatible = SetCatalogAndNamespace
-}
-
-object PersistedViewCompatible {
-  val obj: PersistedView.type = PersistedView
-}
-
-
diff --git a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala b/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala
deleted file mode 100644
index c6e11d0..0000000
--- a/submarine-security/spark-security/spark-3/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlParser.scala
+++ /dev/null
@@ -1,116 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.parser
-
-import org.antlr.v4.runtime.{CharStreams, CommonTokenStream}
-import org.antlr.v4.runtime.atn.PredictionMode
-import org.antlr.v4.runtime.misc.ParseCancellationException
-import org.apache.spark.sql.AnalysisException
-import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
-import org.apache.spark.sql.catalyst.expressions.Expression
-import org.apache.spark.sql.catalyst.parser.{ParseErrorListener, ParseException, ParserInterface, PostProcessor}
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.catalyst.trees.Origin
-import org.apache.spark.sql.types.{DataType, StructType}
-
-class SubmarineSqlParser(val delegate: ParserInterface) extends ParserInterface {
-
-  private val astBuilder = new SubmarineSqlAstBuilder
-
-  override def parsePlan(sqlText: String): LogicalPlan = parse(sqlText) { parser =>
-    astBuilder.visit(parser.singleStatement()) match {
-      case plan: LogicalPlan => plan
-      case _ => delegate.parsePlan(sqlText)
-    }
-  }
-
-  // scalastyle:off line.size.limit
-  /**
-    * Fork from `org.apache.spark.sql.catalyst.parser.AbstractSqlParser#parse(java.lang.String, scala.Function1)`.
-    *
-    * @see https://github.com/apache/spark/blob/v2.4.4/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala#L81
-    */
-  // scalastyle:on
-  private def parse[T](command: String)(toResult: SubmarineSqlBaseParser => T): T = {
-    val lexer = new SubmarineSqlBaseLexer(new UpperCaseCharStream(CharStreams.fromString(command)))
-    lexer.removeErrorListeners()
-    lexer.addErrorListener(ParseErrorListener)
-
-    val tokenStream = new CommonTokenStream(lexer)
-    val parser = new SubmarineSqlBaseParser(tokenStream)
-    parser.addParseListener(PostProcessor)
-    parser.removeErrorListeners()
-    parser.addErrorListener(ParseErrorListener)
-
-    try {
-      try {
-        // first, try parsing with potentially faster SLL mode
-        parser.getInterpreter.setPredictionMode(PredictionMode.SLL)
-        toResult(parser)
-      } catch {
-        case e: ParseCancellationException =>
-          // if we fail, parse with LL mode
-          tokenStream.seek(0) // rewind input stream
-          parser.reset()
-
-          // Try Again.
-          parser.getInterpreter.setPredictionMode(PredictionMode.LL)
-          toResult(parser)
-      }
-    } catch {
-      case e: ParseException if e.command.isDefined =>
-        throw e
-      case e: ParseException =>
-        throw e.withCommand(command)
-      case e: AnalysisException =>
-        val position = Origin(e.line, e.startPosition)
-        throw new ParseException(Option(command), e.message, position, position)
-    }
-  }
-
-  override def parseExpression(sqlText: String): Expression = {
-    delegate.parseExpression(sqlText)
-  }
-
-  override def parseTableIdentifier(sqlText: String): TableIdentifier = {
-    delegate.parseTableIdentifier(sqlText)
-  }
-
-  override def parseFunctionIdentifier(sqlText: String): FunctionIdentifier = {
-    delegate.parseFunctionIdentifier(sqlText)
-  }
-
-  override def parseTableSchema(sqlText: String): StructType = {
-    delegate.parseTableSchema(sqlText)
-  }
-
-  override def parseDataType(sqlText: String): DataType = {
-    delegate.parseDataType(sqlText)
-  }
-
-  override def parseMultipartIdentifier(sqlText: String): Seq[String] = {
-    delegate.parseMultipartIdentifier(sqlText)
-  }
-
-  override def parseRawDataType(sqlText: String): DataType = {
-    delegate.parseRawDataType(sqlText)
-  }
-
-}
diff --git a/submarine-security/spark-security/src/main/antlr4/org/apache/submarine/spark/security/parser/SubmarineSqlBase.g4 b/submarine-security/spark-security/src/main/antlr4/org/apache/submarine/spark/security/parser/SubmarineSqlBase.g4
deleted file mode 100644
index 927ae4d..0000000
--- a/submarine-security/spark-security/src/main/antlr4/org/apache/submarine/spark/security/parser/SubmarineSqlBase.g4
+++ /dev/null
@@ -1,114 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-/*
- * The idea and part of the original code are adopted from the Apache Spark project.
- * We must comply with the same Apache License 2.0.
- */
-
-grammar SubmarineSqlBase;
-
-singleStatement
-    : statement EOF
-    ;
-
-statement
-    : CREATE ROLE identifier                                           #createRole
-    | DROP ROLE identifier                                             #dropRole
-    | SHOW CURRENT ROLES                                               #showCurrentRoles
-    | SHOW ROLES                                                       #showRoles
-    ;
-
-identifier
-    : IDENTIFIER                                                       #unquotedIdentifier
-    | quotedIdentifier                                                 #quotedIdentifierAlternative
-    | nonReserved                                                      #unquotedIdentifier
-    ;
-
-quotedIdentifier
-    : BACKQUOTED_IDENTIFIER
-    ;
-
-nonReserved
-    : ALL
-    | ALTER
-    | CREATE
-    | CURRENT
-    | DELETE
-    | DROP
-    | INSERT
-    | PRIVILEGES
-    | READ
-    | ROLE
-    | ROLES
-    | SELECT
-    | SHOW
-    | UPDATE
-    | USE
-    | WRITE
-    ;
-
-//============================
-// Start of the keywords list
-//============================
-ALL: 'ALL';
-ALTER: 'ALTER';
-CREATE: 'CREATE';
-CURRENT: 'CURRENT';
-DELETE: 'DELETE';
-DROP: 'DROP';
-GRANT: 'GRANT';
-INSERT: 'INSERT';
-PRIVILEGES: 'PRIVILEGES';
-READ: 'READ';
-ROLE: 'ROLE';
-ROLES: 'ROLES';
-SELECT: 'SELECT';
-SHOW: 'SHOW';
-UPDATE: 'UPDATE';
-USE: 'USE';
-WRITE: 'WRITE';
-
-
-BACKQUOTED_IDENTIFIER
-    : '`' ( ~'`' | '``' )* '`'
-    ;
-
-IDENTIFIER
-    : (LETTER | DIGIT | '_')+
-    ;
-
-fragment DIGIT
-    : [0-9]
-    ;
-
-fragment LETTER
-    : [A-Z]
-    ;
-
-WS  : [ \r\n\t]+ -> channel(HIDDEN)
-    ;
-
-// Catch-all for anything we can't recognize.
-// We use this to be able to ignore and recover all the text
-// when splitting statements with DelimiterLexer
-UNRECOGNIZED
-    : .
-    ;
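
Taken together, SubmarineSqlBase.g4 recognizes exactly four role statements. With the Submarine parser extension installed in a session, they read like ordinary SQL (spark below is an existing SparkSession; the session setup itself is omitted):

    spark.sql("CREATE ROLE analyst")        // -> CreateRoleCommand("analyst")
    spark.sql("SHOW ROLES").show()          // -> ShowRolesCommand()
    spark.sql("SHOW CURRENT ROLES").show()  // -> ShowCurrentRolesCommand()
    spark.sql("DROP ROLE analyst")          // -> DropRoleCommand("analyst")
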
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/AuthzUtils.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/AuthzUtils.scala
deleted file mode 100644
index 55fda27..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/AuthzUtils.scala
+++ /dev/null
@@ -1,47 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql
-
-import scala.util.{Failure, Success, Try}
-
-private[sql] object AuthzUtils {
-
-  def getFieldVal(o: Any, name: String): Any = {
-    Try {
-      val field = o.getClass.getDeclaredField(name)
-      field.setAccessible(true)
-      field.get(o)
-    } match {
-      case Success(value) => value
-      case Failure(exception) => throw exception
-    }
-  }
-
-  def setFieldVal(o: Any, name: String, value: Any): Unit = {
-    Try {
-      val field = o.getClass.getDeclaredField(name)
-      field.setAccessible(true)
-      field.set(o, value.asInstanceOf[AnyRef])
-    } match {
-      case Failure(exception) => throw exception
-      case _ =>
-    }
-  }
-}
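
AuthzUtils is a reflection shim: it reads and writes private fields by name so
the plugin can reach Spark internals whose accessors vary across versions. A
self-contained sketch of the same technique (the Holder class and field name
are illustrative, not part of the module):

    import scala.util.{Failure, Success, Try}

    object FieldReflectionSketch {
      // Mirrors AuthzUtils.getFieldVal above: locate the declared field,
      // force accessibility, and read its value.
      def getFieldVal(o: Any, name: String): Any =
        Try {
          val field = o.getClass.getDeclaredField(name)
          field.setAccessible(true)
          field.get(o)
        } match {
          case Success(value)     => value
          case Failure(exception) => throw exception
        }

      // Illustrative target; any class with a private field works the same way.
      class Holder { private val secret: String = "hidden" }

      def main(args: Array[String]): Unit =
        println(getFieldVal(new Holder, "secret")) // prints "hidden"
    }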
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineConfigurationCheckExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineConfigurationCheckExtension.scala
deleted file mode 100644
index 44e94ec..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineConfigurationCheckExtension.scala
+++ /dev/null
@@ -1,50 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.execution.command.SetCommand
-
-import org.apache.submarine.spark.security.SparkAccessControlException
-
-/**
- * For banning end-users from setting restricted Spark configurations
- */
-case class SubmarineConfigurationCheckExtension(spark: SparkSession)
-  extends (LogicalPlan => Unit) {
-
-  final val RESTRICT_LIST_KEY = "spark.sql.submarine.conf.restricted.list"
-
-  private val bannedList: Seq[String] =
-    RESTRICT_LIST_KEY ::
-      "spark.sql.runSQLOnFiles" ::
-      "spark.sql.extensions" ::
-      spark.conf.getOption(RESTRICT_LIST_KEY).map(_.split(',').toList).getOrElse(Nil)
-
-  override def apply(plan: LogicalPlan): Unit = plan match {
-    case SetCommand(Some(("spark.sql.optimizer.excludedRules", Some(v))))
-        if v.contains("Submarine") =>
-      throw new SparkAccessControlException("Excluding Submarine security rules is not allowed")
-    case SetCommand(Some((k, Some(_)))) if bannedList.contains(k) =>
-      throw new SparkAccessControlException(s"Modifying $k is not allowed")
-    case _ =>
-  }
-}
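
Once this check is installed, guarded settings fail fast at analysis time. A
hedged sketch of the observable behavior, assuming spark is a session with the
extension wired in (the exception messages follow the rule above):

    // Banned outright by the built-in list:
    spark.sql("SET spark.sql.runSQLOnFiles=true")
    // => SparkAccessControlException:
    //    Modifying spark.sql.runSQLOnFiles is not allowed

    // Attempting to disable the security rules themselves:
    spark.sql("SET spark.sql.optimizer.excludedRules=" +
      "org.apache.spark.sql.catalyst.optimizer.SubmarineRowFilterExtension")
    // => SparkAccessControlException:
    //    Excluding Submarine security rules is not allowed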
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineDataMaskingExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineDataMaskingExtension.scala
deleted file mode 100644
index 7941930..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineDataMaskingExtension.scala
+++ /dev/null
@@ -1,268 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import scala.collection.mutable
-
-import org.apache.commons.lang3.StringUtils
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.ranger.plugin.policyengine.RangerAccessResult
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.{FunctionIdentifier, TableIdentifier}
-import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute
-import org.apache.spark.sql.catalyst.catalog.{CatalogFunction, CatalogTable, HiveTableRelation}
-import org.apache.spark.sql.catalyst.expressions.{Alias, Attribute, AttributeReference, Expression, ExprId, NamedExpression, SubqueryExpression}
-import org.apache.spark.sql.catalyst.plans.logical._
-import org.apache.spark.sql.catalyst.rules.Rule
-import org.apache.spark.sql.execution.command.{CreateDataSourceTableAsSelectCommand, CreateViewCommand, InsertIntoDataSourceDirCommand}
-import org.apache.spark.sql.execution.datasources.{InsertIntoDataSourceCommand, InsertIntoHadoopFsRelationCommand, LogicalRelation, SaveIntoDataSourceCommand}
-import org.apache.spark.sql.hive.execution.{CreateHiveTableAsSelectCommand, InsertIntoHiveDirCommand, InsertIntoHiveTable}
-
-import org.apache.submarine.spark.compatible.SubqueryCompatible
-import org.apache.submarine.spark.security._
-import org.apache.submarine.spark.security.SparkObjectType.COLUMN
-
-
-/**
- * An Apache Spark [[Optimizer]] extension for column data masking.
- * TODO(kent yao) implement this as an analyzer rule
- */
-case class SubmarineDataMaskingExtension(spark: SparkSession) extends Rule[LogicalPlan] {
-  import org.apache.ranger.plugin.model.RangerPolicy._
-
-  // register all built-in masking udfs
-  Map("mask" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMask",
-    "mask_first_n" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMaskFirstN",
-    "mask_hash" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMaskHash",
-    "mask_last_n" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMaskLastN",
-    "mask_show_first_n" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMaskShowFirstN",
-    "mask_show_last_n" -> "org.apache.hadoop.hive.ql.udf.generic.GenericUDFMaskShowLastN")
-    .map(x => CatalogFunction(FunctionIdentifier(x._1), x._2, Seq.empty))
-    .foreach(spark.sessionState.catalog.registerFunction(_, overrideIfExists = true))
-
-  private lazy val sqlParser = spark.sessionState.sqlParser
-  private lazy val analyzer = spark.sessionState.analyzer
-  private lazy val auditHandler = RangerSparkAuditHandler()
-  private def currentUser: UserGroupInformation = UserGroupInformation.getCurrentUser
-
-  /**
- * Get a RangerAccessResult, which contains the data masking rules, from the Ranger admin or
- * local policies
-   */
-  private def getAccessResult(identifier: TableIdentifier, attr: Attribute): RangerAccessResult = {
-    val resource = RangerSparkResource(COLUMN, identifier.database, identifier.table, attr.name)
-    val req = new RangerSparkAccessRequest(
-      resource,
-      currentUser.getShortUserName,
-      currentUser.getGroupNames.toSet,
-      COLUMN.toString,
-      SparkAccessType.SELECT,
-      RangerSparkPlugin.getClusterName)
-    RangerSparkPlugin.evalDataMaskPolicies(req, auditHandler)
-  }
-
-  /**
-   * Generate an [[Alias]] expression with the access result and original expression, which can be
-   * used to replace the original output of the query.
-   *
- * This alias contains a child, which might be a null literal or an [[UnresolvedFunction]]. When
- * the child is a function, we replace its [[UnresolvedAttribute]] argument with the input
- * attribute so it resolves directly.
-   */
-  private def getMasker(attr: Attribute, result: RangerAccessResult): Alias = {
-    val expr = if (StringUtils.equalsIgnoreCase(result.getMaskType, MASK_TYPE_NULL)) {
-      "NULL"
-    } else if (StringUtils.equalsIgnoreCase(result.getMaskType, MASK_TYPE_CUSTOM)) {
-      val maskVal = result.getMaskedValue
-      if (maskVal == null) {
-        "NULL"
-      } else {
-        s"${maskVal.replace("{col}", attr.name)}"
-      }
-    } else if (result.getMaskTypeDef != null) {
-      val transformer = result.getMaskTypeDef.getTransformer
-      if (StringUtils.isNotEmpty(transformer)) {
-        s"${transformer.replace("{col}", attr.name)}"
-      } else {
-        return null
-      }
-    } else {
-      return null
-    }
-
-    // sql expression text -> UnresolvedFunction
-    val parsed = sqlParser.parseExpression(expr)
-
-    // Here we replace the attribute with the resolved one, e.g.
-    // 'mask_show_last_n('value, 4, x, x, x, -1, 1)
-    // ->
-    // 'mask_show_last_n(value#37, 4, x, x, x, -1, 1)
-    val resolved = parsed mapChildren {
-      case _: UnresolvedAttribute => attr
-      case o => o
-    }
-    Alias(resolved, attr.name)()
-  }
-
-  /**
- * Collects transformers from Ranger data masking policies and maps them to the
-   * [[LogicalPlan]] output attributes.
-   *
- * @param plan the original logical plan with an underlying catalog table
-   * @param table the catalog table
-   * @return a list of key-value pairs of original expression with its masking representation
-   */
-  private def collectTransformers(
-      plan: LogicalPlan,
-      table: CatalogTable,
-      aliases: mutable.Map[Alias, ExprId],
-      outputs: Seq[NamedExpression]): Map[ExprId, NamedExpression] = {
-    try {
-      val maskEnableResults = plan.output.map { expr =>
-        expr -> getAccessResult(table.identifier, expr)
-      }.filter(x => isMaskEnabled(x._2))
-
-      val formedMaskers = maskEnableResults.map { case (expr, result) =>
-        expr.exprId -> getMasker(expr, result)
-      }.filter(_._2 != null).toMap
-
-      val aliasedMaskers = new mutable.HashMap[ExprId, Alias]()
-
-      for (output <- outputs) {
-        val newOutput = output transformUp {
-          case ar: AttributeReference => formedMaskers.getOrElse(ar.exprId, ar)
-        }
-
-        if (!output.equals(newOutput)) {
-          val newAlias = Alias(newOutput, output.name)()
-          aliasedMaskers.put(output.exprId, newAlias)
-        }
-      }
-
-      for ((alias, id) <- aliases if formedMaskers.contains(id)) {
-        val originalAlias = formedMaskers(id)
-        val newChild = originalAlias.child mapChildren {
-          case ar: AttributeReference =>
-            ar.copy(name = alias.name)(alias.exprId, alias.qualifier)
-          case o => o
-        }
-        val newAlias = Alias(newChild, alias.name)()
-        aliasedMaskers.put(alias.exprId, newAlias)
-      }
-
-      formedMaskers ++ aliasedMaskers
-    } catch {
-      case e: Exception => throw e
-    }
-  }
-
-  private def isMaskEnabled(result: RangerAccessResult): Boolean = {
-    result != null && result.isMaskEnabled
-  }
-
-  private def hasCatalogTable(plan: LogicalPlan): Boolean = plan match {
-    case _: HiveTableRelation => true
-    case l: LogicalRelation if l.catalogTable.isDefined => true
-    case _ => false
-  }
-
-  private def collectAllAliases(plan: LogicalPlan): mutable.HashMap[Alias, ExprId] = {
-    val aliases = new mutable.HashMap[Alias, ExprId]()
-    plan.transformAllExpressions {
-      case a: Alias =>
-        a.child transformUp {
-          case ne: NamedExpression =>
-            aliases.getOrElseUpdate(a, ne.exprId)
-            ne
-        }
-        a
-    }
-    aliases
-  }
-
-  private def collectAllTransformers(
-      plan: LogicalPlan,
-      aliases: mutable.Map[Alias, ExprId]): Map[ExprId, NamedExpression] = {
-    val outputs = plan match {
-      case p: Project => p.projectList
-      case o => o.output
-    }
-
-    plan.collectLeaves().flatMap {
-      case h: HiveTableRelation =>
-        collectTransformers(h, h.tableMeta, aliases, outputs)
-      case l: LogicalRelation if l.catalogTable.isDefined =>
-        collectTransformers(l, l.catalogTable.get, aliases, outputs)
-      case _ => Seq.empty
-    }.toMap
-  }
-
-  private def doMasking(plan: LogicalPlan): LogicalPlan = plan match {
-    case s: Subquery => s
-    case m: SubmarineDataMasking => m // escape the optimize iteration if already masked
-    case fixed if fixed.find(_.isInstanceOf[SubmarineDataMasking]).nonEmpty => fixed
-    case _ =>
-      val aliases = collectAllAliases(plan)
-      val transformers = collectAllTransformers(plan, aliases)
-      val newPlan =
-        if (transformers.nonEmpty && plan.output.exists(o => transformers.get(o.exprId).nonEmpty)) {
-          val newOutput = plan.output.map(attr => transformers.getOrElse(attr.exprId, attr))
-          Project(newOutput, plan)
-        } else {
-          plan
-        }
-
-      // Call spark analysis here explicitly to resolve UnresolvedFunctions
-      val marked = analyzer.execute(newPlan) transformUp {
-        case p if hasCatalogTable(p) => SubmarineDataMasking(p)
-      }
-
-      // Extract global/local limit if any and apply after masking projection
-      val limitExpr: Option[Expression] = plan match {
-        case globalLimit: GlobalLimit => Some(globalLimit.limitExpr)
-        case localLimit: LocalLimit => Some(localLimit.limitExpr)
-        case _ => None
-      }
-
-      val markedWithLimit = if (limitExpr.isDefined) Limit(limitExpr.get, marked) else marked
-
-      markedWithLimit transformAllExpressions {
-        case s: SubqueryExpression =>
-          val SubqueryCompatible(newPlan, _) = SubqueryCompatible(
-              SubmarineDataMasking(s.plan), SubqueryExpression.hasCorrelatedSubquery(s))
-          s.withNewPlan(newPlan)
-      }
-  }
-
-  override def apply(plan: LogicalPlan): LogicalPlan = plan match {
-    case c: Command => c match {
-      case c: CreateDataSourceTableAsSelectCommand => c.copy(query = doMasking(c.query))
-      case c: CreateHiveTableAsSelectCommand => c.copy(query = doMasking(c.query))
-      case c: CreateViewCommand => c.copy(child = doMasking(c.child))
-      case i: InsertIntoDataSourceCommand => i.copy(query = doMasking(i.query))
-      case i: InsertIntoDataSourceDirCommand => i.copy(query = doMasking(i.query))
-      case i: InsertIntoHadoopFsRelationCommand => i.copy(query = doMasking(i.query))
-      case i: InsertIntoHiveDirCommand => i.copy(query = doMasking(i.query))
-      case i: InsertIntoHiveTable => i.copy(query = doMasking(i.query))
-      case s: SaveIntoDataSourceCommand => s.copy(query = doMasking(s.query))
-      case cmd => cmd
-    }
-    case other => doMasking(other)
-  }
-}
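
The net effect of the rule is an extra Project over any relation whose columns
carry masking policies. A schematic before/after, assuming a hypothetical
table t(value STRING) under a Ranger "show last 4" mask:

    // Before the rule:
    //   Project [value#37]
    //   +- HiveTableRelation `default`.`t`, [value#37]
    //
    // After SubmarineDataMaskingExtension (masker built by getMasker above):
    //   Project [mask_show_last_n(value#37, 4, x, x, x, -1, 1) AS value#42]
    //   +- SubmarineDataMasking
    //      +- HiveTableRelation `default`.`t`, [value#37]
    spark.sql("SELECT value FROM t").show()  // returns masked values transparently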
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarinePushPredicatesThroughExtensions.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarinePushPredicatesThroughExtensions.scala
deleted file mode 100644
index fb442d9..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarinePushPredicatesThroughExtensions.scala
+++ /dev/null
@@ -1,34 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan, SubmarineDataMasking, SubmarineRowFilter}
-import org.apache.spark.sql.catalyst.rules.Rule
-
-case class SubmarinePushPredicatesThroughExtensions(spark: SparkSession) extends Rule[LogicalPlan] {
-
-  override def apply(plan: LogicalPlan): LogicalPlan = plan transform {
-    case f @ Filter(_, SubmarineRowFilter(ch)) =>
-      SubmarineRowFilter(f.copy(child = ch))
-    case f @ Filter(_, SubmarineDataMasking(ch)) =>
-      SubmarineDataMasking(f.copy(child = ch))
-  }
-}
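
This small rule keeps ordinary Filters below the Submarine marker nodes so
that later optimizer batches can still push them toward the data source.
Schematically (hypothetical predicate c > 1):

    // Before:                         After:
    //   Filter(c > 1)                   SubmarineRowFilter
    //   +- SubmarineRowFilter           +- Filter(c > 1)
    //      +- Relation                     +- Relation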
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineRowFilterExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineRowFilterExtension.scala
deleted file mode 100644
index fc439b5..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineRowFilterExtension.scala
+++ /dev/null
@@ -1,133 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import org.apache.commons.lang3.StringUtils
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.ranger.plugin.policyengine.RangerAccessResult
-import org.apache.spark.sql.AuthzUtils.getFieldVal
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.catalog.CatalogTable
-import org.apache.spark.sql.catalyst.expressions.SubqueryExpression
-import org.apache.spark.sql.catalyst.plans.logical._
-import org.apache.spark.sql.catalyst.rules.Rule
-import org.apache.spark.sql.execution.command.{CreateDataSourceTableAsSelectCommand, CreateViewCommand, InsertIntoDataSourceDirCommand}
-import org.apache.spark.sql.execution.datasources.{InsertIntoDataSourceCommand, InsertIntoHadoopFsRelationCommand, LogicalRelation, SaveIntoDataSourceCommand}
-import org.apache.spark.sql.hive.execution.{CreateHiveTableAsSelectCommand, InsertIntoHiveDirCommand, InsertIntoHiveTable}
-
-import org.apache.submarine.spark.compatible.SubqueryCompatible
-import org.apache.submarine.spark.security._
-
-/**
- * An Apache Spark [[Optimizer]] extension for row-level filtering.
- */
-case class SubmarineRowFilterExtension(spark: SparkSession) extends Rule[LogicalPlan] {
-  private lazy val rangerSparkOptimizer = new SubmarineSparkOptimizer(spark)
-
-  /**
-   * Transform a Relation to a parsed [[LogicalPlan]] with specified row filter expressions
-   * @param plan the original [[LogicalPlan]]
-   * @param table a Spark [[CatalogTable]] representation
-   * @return A new Spark [[LogicalPlan]] with specified row filter expressions
-   */
-  private def applyingRowFilterExpr(plan: LogicalPlan, table: CatalogTable): LogicalPlan = {
-    val auditHandler = RangerSparkAuditHandler()
-    try {
-      val identifier = table.identifier
-      val resource =
-        RangerSparkResource(SparkObjectType.TABLE, identifier.database, identifier.table)
-      val ugi = UserGroupInformation.getCurrentUser
-      val request = new RangerSparkAccessRequest(resource, ugi.getShortUserName,
-        ugi.getGroupNames.toSet, SparkObjectType.TABLE.toString, SparkAccessType.SELECT,
-        RangerSparkPlugin.getClusterName)
-      val result = RangerSparkPlugin.evalRowFilterPolicies(request, auditHandler)
-      if (isRowFilterEnabled(result)) {
-        val condition = spark.sessionState.sqlParser.parseExpression(result.getFilterExpr)
-        val analyzed = spark.sessionState.analyzer.execute(Filter(condition, plan))
-        val optimized = analyzed transformAllExpressions {
-          case s: SubqueryExpression =>
-            val SubqueryCompatible(newPlan, _) = SubqueryCompatible(
-              SubmarineRowFilter(s.plan), SubqueryExpression.hasCorrelatedSubquery(s))
-            s.withNewPlan(rangerSparkOptimizer.execute(newPlan))
-        }
-        SubmarineRowFilter(optimized)
-      } else {
-        SubmarineRowFilter(plan)
-      }
-    } catch {
-      case e: Exception => throw e
-    }
-  }
-
-  private def isRowFilterEnabled(result: RangerAccessResult): Boolean = {
-    result != null && result.isRowFilterEnabled && StringUtils.isNotEmpty(result.getFilterExpr)
-  }
-
-  private def getPlanWithTables(plan: LogicalPlan): Map[LogicalPlan, CatalogTable] = {
-    plan.collectLeaves().map {
-      case h if h.nodeName == "HiveTableRelation" =>
-        h -> getFieldVal(h, "tableMeta").asInstanceOf[CatalogTable]
-      case m if m.nodeName == "MetastoreRelation" =>
-        m -> getFieldVal(m, "catalogTable").asInstanceOf[CatalogTable]
-      case l: LogicalRelation if l.catalogTable.isDefined =>
-        l -> l.catalogTable.get
-      case _ => null
-    }.filter(_ != null).toMap
-  }
-
-  private def isFixed(plan: LogicalPlan): Boolean = {
-    val markNum = plan.collect { case _: SubmarineRowFilter => true }.size
-    markNum >= getPlanWithTables(plan).size
-  }
-  private def doFiltering(plan: LogicalPlan): LogicalPlan = plan match {
-    case rf: SubmarineRowFilter => rf
-    case plan if isFixed(plan) => plan
-    case _ =>
-      val plansWithTables = getPlanWithTables(plan)
-        .map { case (plan, table) =>
-          (plan, applyingRowFilterExpr(plan, table))
-        }
-
-      plan transformUp {
-        case p => plansWithTables.getOrElse(p, p)
-      }
-  }
-
-  /**
-   * Transform a Spark logical plan to another plan with the row filter expressions
-   * @param plan the original [[LogicalPlan]]
-   * @return the logical plan with row filter expressions applied
-   */
-  override def apply(plan: LogicalPlan): LogicalPlan = plan match {
-    case c: Command => c match {
-      case c: CreateDataSourceTableAsSelectCommand => c.copy(query = doFiltering(c.query))
-      case c: CreateHiveTableAsSelectCommand => c.copy(query = doFiltering(c.query))
-      case c: CreateViewCommand => c.copy(child = doFiltering(c.child))
-      case i: InsertIntoDataSourceCommand => i.copy(query = doFiltering(i.query))
-      case i: InsertIntoDataSourceDirCommand => i.copy(query = doFiltering(i.query))
-      case i: InsertIntoHadoopFsRelationCommand => i.copy(query = doFiltering(i.query))
-      case i: InsertIntoHiveDirCommand => i.copy(query = doFiltering(i.query))
-      case i: InsertIntoHiveTable => i.copy(query = doFiltering(i.query))
-      case s: SaveIntoDataSourceCommand => s.copy(query = doFiltering(s.query))
-      case cmd => cmd
-    }
-    case other => doFiltering(other)
-  }
-}
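
End to end, the rule makes row-filter policies transparent to queries. A
hedged usage sketch, assuming spark has the extensions installed and Ranger
holds a row-filter policy country = 'US' on a hypothetical table sales:

    // Every read of `sales` is implicitly filtered by the policy expression:
    val df = spark.sql("SELECT * FROM sales")
    df.show()
    // behaves as if the user had written:
    //   SELECT * FROM sales WHERE country = 'US'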
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkOptimizer.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkOptimizer.scala
deleted file mode 100644
index 36811ce..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkOptimizer.scala
+++ /dev/null
@@ -1,39 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-import org.apache.spark.sql.catalyst.rules.RuleExecutor
-
-/**
- * An Optimizer that excludes all rules injected via `spark.sql.extensions`
- */
-class SubmarineSparkOptimizer(spark: SparkSession) extends RuleExecutor[LogicalPlan] {
-
-  override def batches: Seq[Batch] = {
-    val optimizer = spark.sessionState.optimizer
-    val extRules = optimizer.extendedOperatorOptimizationRules
-    optimizer.batches.map { batch =>
-      val ruleSet = batch.rules.toSet -- extRules
-      Batch(batch.name, FixedPoint(batch.strategy.maxIterations), ruleSet.toSeq: _*)
-    }
-  }
-}
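
This stripped-down optimizer lets the row-filter rule re-optimize subquery
plans without re-entering the Submarine rules themselves, which would
otherwise recurse. Its use inside the module is minimal, as seen in
SubmarineRowFilterExtension above (hedged restatement of that call site):

    val rangerSparkOptimizer = new SubmarineSparkOptimizer(spark)
    // `newPlan` is a subquery plan wrapped in SubmarineRowFilter.
    val reOptimized = rangerSparkOptimizer.execute(newPlan)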
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkRangerAuthorizationExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkRangerAuthorizationExtension.scala
deleted file mode 100644
index 68dc720..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/optimizer/SubmarineSparkRangerAuthorizationExtension.scala
+++ /dev/null
@@ -1,173 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.optimizer
-
-import org.apache.commons.logging.LogFactory
-import org.apache.spark.sql.SparkSession
-import org.apache.spark.sql.catalyst.plans.logical.{Command, LogicalPlan}
-import org.apache.spark.sql.catalyst.rules.Rule
-import org.apache.spark.sql.execution.{SubmarineShowDatabasesCommand, SubmarineShowTablesCommand}
-import org.apache.spark.sql.execution.command._
-import org.apache.spark.sql.execution.datasources.{CreateTempViewUsing, InsertIntoDataSourceCommand, InsertIntoHadoopFsRelationCommand}
-import org.apache.spark.sql.hive.PrivilegesBuilder
-import org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand
-
-import org.apache.submarine.spark.compatible.CompatibleCommand._
-import org.apache.submarine.spark.security.{RangerSparkAuthorizer, SparkAccessControlException}
-
-/**
- * An Optimizer Rule that enforces SQL-standard ACLs for Spark SQL.
- *
- * For Apache Spark 2.3.x and later
- */
-case class SubmarineSparkRangerAuthorizationExtension(spark: SparkSession)
-  extends Rule[LogicalPlan] {
-  import org.apache.submarine.spark.security.SparkOperationType._
-
-  private val LOG = LogFactory.getLog(classOf[SubmarineSparkRangerAuthorizationExtension])
-
-  /**
-   * Visit the [[LogicalPlan]] recursively to collect all Spark privilege objects and check them.
-   *
-   * If the user is authorized, the original plan is returned; otherwise execution is interrupted
-   * by a privilege exception.
-   * @param plan a Spark LogicalPlan whose privileges are verified
-   * @return the plan itself, once it has passed the privilege check.
-   */
-  override def apply(plan: LogicalPlan): LogicalPlan = {
-    plan match {
-      case s: ShowDatabasesCommandCompatible => SubmarineShowDatabasesCommand(s)
-      case s: SubmarineShowDatabasesCommand => s
-      case s: ShowTablesCommand => SubmarineShowTablesCommand(s)
-      case s: SubmarineShowTablesCommand => s
-      case ResetCommand => SubmarineResetCommand
-      case _ =>
-        val operationType: SparkOperationType = toOperationType(plan)
-        val (in, out) = PrivilegesBuilder.build(plan)
-        try {
-          RangerSparkAuthorizer.checkPrivileges(spark, operationType, in, out)
-          plan
-        } catch {
-          case ace: SparkAccessControlException =>
-            LOG.error(
-              s"""
-                 |+===============================+
-                 ||Spark SQL Authorization Failure|
-                 ||-------------------------------|
-                 ||${ace.getMessage}
-                 ||-------------------------------|
-                 ||Spark SQL Authorization Failure|
-                 |+===============================+
-               """.stripMargin)
-            throw ace
-        }
-    }
-  }
-
-  /**
-   * Mapping of [[LogicalPlan]] -> [[SparkOperationType]]
-   * @param plan a spark LogicalPlan
-   * @return
-   */
-  private def toOperationType(plan: LogicalPlan): SparkOperationType = {
-    plan match {
-      case c: Command => c match {
-        case _: AlterDatabasePropertiesCommand => ALTERDATABASE
-        case p if p.nodeName == "AlterTableAddColumnsCommand" => ALTERTABLE_ADDCOLS
-        case _: AlterTableAddPartitionCommand => ALTERTABLE_ADDPARTS
-        case p if p.nodeName == "AlterTableChangeColumnCommand" => ALTERTABLE_RENAMECOL
-        case _: AlterTableDropPartitionCommand => ALTERTABLE_DROPPARTS
-        case _: AlterTableRecoverPartitionsCommand => MSCK
-        case _: AlterTableRenamePartitionCommand => ALTERTABLE_RENAMEPART
-        case a: AlterTableRenameCommand => if (!a.isView) ALTERTABLE_RENAME else ALTERVIEW_RENAME
-        case _: AlterTableSetPropertiesCommand
-             | _: AlterTableUnsetPropertiesCommand => ALTERTABLE_PROPERTIES
-        case _: AlterTableSerDePropertiesCommand => ALTERTABLE_SERDEPROPERTIES
-        case _: AlterTableSetLocationCommand => ALTERTABLE_LOCATION
-        case _: AlterViewAsCommand => QUERY
-
-        case _: AnalyzeColumnCommand => QUERY
-        // case _: AnalyzeTableCommand => HiveOperation.ANALYZE_TABLE
-        // Hive treats AnalyzeTableCommand as QUERY, so we follow suit.
-        case _: AnalyzeTableCommand => QUERY
-        case p if p.nodeName == "AnalyzePartitionCommand" => QUERY
-
-        case _: CreateDatabaseCommand => CREATEDATABASE
-        case _: CreateDataSourceTableAsSelectCommand
-             | _: CreateHiveTableAsSelectCommand => CREATETABLE_AS_SELECT
-        case _: CreateFunctionCommand => CREATEFUNCTION
-        case _: CreateTableCommand
-             | _: CreateDataSourceTableCommand => CREATETABLE
-        case _: CreateTableLikeCommand => CREATETABLE
-        case _: CreateViewCommand
-             | _: CacheTableCommand
-             | _: CreateTempViewUsing => CREATEVIEW
-
-        case p if p.nodeName == "DescribeColumnCommand" => DESCTABLE
-        case _: DescribeDatabaseCommand => DESCDATABASE
-        case _: DescribeFunctionCommand => DESCFUNCTION
-        case _: DescribeTableCommand => DESCTABLE
-
-        case _: DropDatabaseCommand => DROPDATABASE
-        // Hive doesn't check privileges for `drop function command`, so an unverified user
-        // could drop functions.
-        // We treat permanent functions as tables when verifying.
-        case d: DropFunctionCommand if !d.isTemp => DROPTABLE
-        case d: DropFunctionCommand if d.isTemp => DROPFUNCTION
-        case _: DropTableCommand => DROPTABLE
-
-        case e: ExplainCommand => toOperationType(e.logicalPlan)
-
-        case _: InsertIntoDataSourceCommand => QUERY
-        case p if p.nodeName == "InsertIntoDataSourceDirCommand" => QUERY
-        case _: InsertIntoHadoopFsRelationCommand => CREATETABLE_AS_SELECT
-        case p if p.nodeName == "InsertIntoHiveDirCommand" => QUERY
-        case p if p.nodeName == "InsertIntoHiveTable" => QUERY
-
-        case _: LoadDataCommand => LOAD
-
-        case p if p.nodeName == "SaveIntoDataSourceCommand" => QUERY
-        case s: SetCommand if s.kv.isEmpty || s.kv.get._2.isEmpty => SHOWCONF
-        case _: SetDatabaseCommandCompatible => SWITCHDATABASE
-        case _: ShowCreateTableCommand => SHOW_CREATETABLE
-        case _: ShowColumnsCommand => SHOWCOLUMNS
-        case _: ShowDatabasesCommandCompatible => SHOWDATABASES
-        case _: ShowFunctionsCommand => SHOWFUNCTIONS
-        case _: ShowPartitionsCommand => SHOWPARTITIONS
-        case _: ShowTablesCommand => SHOWTABLES
-        case _: ShowTablePropertiesCommand => SHOW_TBLPROPERTIES
-        case s: StreamingExplainCommand =>
-          toOperationType(s.queryExecution.optimizedPlan)
-
-        case _: TruncateTableCommand => TRUNCATETABLE
-
-        case _: UncacheTableCommand => DROPVIEW
-
-        // Commands that do not need privilege building are treated as the EXPLAIN type
-        case _ =>
-          // AddFileCommand
-          // AddJarCommand
-          // ...
-          EXPLAIN
-      }
-      case _ => QUERY
-    }
-  }
-}
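
When a check fails, the rule aborts the query before any execution starts. A
hedged sketch of the failure path, assuming the extensions are installed and
user alice lacks SELECT on a hypothetical table default.secret (the message
wording is illustrative, not quoted from the module):

    spark.sql("SELECT * FROM default.secret").show()
    // => SparkAccessControlException, logged inside the banner shown above:
    //    Permission denied: user [alice] does not have [SELECT] privilege
    //    on [default/secret]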
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineDataMasking.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineDataMasking.scala
deleted file mode 100644
index 10d1c17..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineDataMasking.scala
+++ /dev/null
@@ -1,30 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.plans.logical
-
-import org.apache.spark.sql.catalyst.expressions.Attribute
-
-/**
- * A marker [[LogicalPlan]] for column data masking, which will be removed during
- * LogicalPlan -> PhysicalPlan
- */
-case class SubmarineDataMasking(child: LogicalPlan) extends UnaryNode {
-  override def output: Seq[Attribute] = child.output
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineRowFilter.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineRowFilter.scala
deleted file mode 100644
index dec2752..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/SubmarineRowFilter.scala
+++ /dev/null
@@ -1,31 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.catalyst.plans.logical
-
-import org.apache.spark.sql.catalyst.expressions.Attribute
-
-/**
- * A wrapper for a transformed plan with a row-level filter applied, which will be removed
- * during LogicalPlan -> PhysicalPlan
- */
-case class SubmarineRowFilter(child: LogicalPlan) extends UnaryNode {
-  override def output: Seq[Attribute] = child.output
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowDatabasesCommand.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowDatabasesCommand.scala
deleted file mode 100644
index dd782f0..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowDatabasesCommand.scala
+++ /dev/null
@@ -1,45 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.execution
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.execution.command.RunnableCommand
-
-import org.apache.submarine.spark.compatible.CompatibleCommand.ShowDatabasesCommandCompatible
-import org.apache.submarine.spark.compatible.CompatibleFunc
-import org.apache.submarine.spark.security.{RangerSparkAuthorizer, SparkPrivilegeObject, SparkPrivilegeObjectType}
-
-case class SubmarineShowDatabasesCommand(child: ShowDatabasesCommandCompatible)
-  extends RunnableCommand {
-  override val output = child.output
-
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    val catalog = sparkSession.sessionState.catalog
-    val databases = CompatibleFunc.getPattern(child)
-      .map(catalog.listDatabases).getOrElse(catalog.listDatabases()).map { d => Row(d) }
-
-    databases.filter(r => RangerSparkAuthorizer.isAllowed(toSparkPrivilegeObject(r)))
-  }
-
-  private def toSparkPrivilegeObject(row: Row): SparkPrivilegeObject = {
-    val database = row.getString(0)
-    new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, database, database)
-  }
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowTablesCommand.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowTablesCommand.scala
deleted file mode 100644
index b32e132..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineShowTablesCommand.scala
+++ /dev/null
@@ -1,41 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.execution
-
-import org.apache.spark.sql.{Row, SparkSession}
-import org.apache.spark.sql.catalyst.expressions.Attribute
-import org.apache.spark.sql.execution.command.{RunnableCommand, ShowTablesCommand}
-
-import org.apache.submarine.spark.security.{RangerSparkAuthorizer, SparkPrivilegeObject, SparkPrivilegeObjectType}
-
-case class SubmarineShowTablesCommand(child: ShowTablesCommand) extends RunnableCommand {
-
-  override val output: Seq[Attribute] = child.output
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    val rows = child.run(sparkSession)
-    rows.filter(r => RangerSparkAuthorizer.isAllowed(toSparkPrivilegeObject(r)))
-  }
-
-  private def toSparkPrivilegeObject(row: Row): SparkPrivilegeObject = {
-    val database = row.getString(0)
-    val table = row.getString(1)
-    new SparkPrivilegeObject(SparkPrivilegeObjectType.TABLE_OR_VIEW, database, table)
-  }
-}
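
SubmarineShowDatabasesCommand and SubmarineShowTablesCommand apply the same
pattern: run the original command, then drop every row the current user may
not access. A hedged sketch of the visible effect:

    // With the extensions installed, metadata listings are silently pruned:
    spark.sql("SHOW DATABASES").show()        // only databases visible to the user
    spark.sql("SHOW TABLES IN default").show()
    // rows failing RangerSparkAuthorizer.isAllowed(...) are filtered out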
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineSparkPlanOmitStrategy.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineSparkPlanOmitStrategy.scala
deleted file mode 100644
index 6c59b7b..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/SubmarineSparkPlanOmitStrategy.scala
+++ /dev/null
@@ -1,35 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.execution
-
-import org.apache.spark.sql.{SparkSession, Strategy}
-import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, SubmarineDataMasking, SubmarineRowFilter}
-
-/**
- * An Apache Spark [[Strategy]] extension that omits the marker nodes for row-level filtering
- * and data masking.
- */
-case class SubmarineSparkPlanOmitStrategy(spark: SparkSession) extends Strategy {
-  override def apply(plan: LogicalPlan): Seq[SparkPlan] = plan match {
-    case SubmarineRowFilter(child) => planLater(child) :: Nil
-    case SubmarineDataMasking(child) => planLater(child) :: Nil
-    case _ => Nil
-  }
-}
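
Because the marker nodes carry no runtime semantics, the strategy simply plans
their children, so neither marker survives into the physical plan.
Schematically:

    // Logical plan:                    Physical plan:
    //   SubmarineRowFilter               <plan of Filter(...)>
    //   +- Filter(...)                   (marker omitted via planLater)
    //      +- Relation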
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/command/SubmarineResetCommand.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/command/SubmarineResetCommand.scala
deleted file mode 100644
index 837624e..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/execution/command/SubmarineResetCommand.scala
+++ /dev/null
@@ -1,38 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.execution.command
-
-import org.apache.spark.sql.{Row, SparkSession}
-
-/**
- * Runtime replacement for Spark's original [[ResetCommand]], since that operation would
- * wipe out all configuration, including our security-specific settings;
- * see: https://issues.apache.org/jira/browse/SPARK-31234
- */
-case object SubmarineResetCommand extends RunnableCommand {
-  override def run(sparkSession: SparkSession): Seq[Row] = {
-    val conf = sparkSession.sessionState.conf
-    conf.clear()
-    sparkSession.sparkContext.getConf.getAll.foreach { case (k, v) =>
-      conf.setConfString(k, v)
-    }
-    Seq.empty[Row]
-  }
-}
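
The replacement re-seeds the session conf from the launch-time SparkConf after
clearing it, so a user-issued RESET cannot strip security settings configured
at submit time. A hedged sketch:

    // With the extensions installed, RESET maps to SubmarineResetCommand:
    spark.sql("SET spark.sql.shuffle.partitions=5")
    spark.sql("RESET")
    // The session conf is cleared, then repopulated from
    // spark.sparkContext.getConf, so --conf settings (including the security
    // extensions) survive the reset.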
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/hive/PrivilegesBuilder.scala b/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/hive/PrivilegesBuilder.scala
deleted file mode 100644
index 958223f..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/spark/sql/hive/PrivilegesBuilder.scala
+++ /dev/null
@@ -1,467 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.spark.sql.hive
-
-import scala.collection.mutable.ArrayBuffer
-
-import org.apache.spark.sql.AuthzUtils._
-import org.apache.spark.sql.SaveMode
-import org.apache.spark.sql.catalyst.TableIdentifier
-import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
-import org.apache.spark.sql.catalyst.catalog.CatalogTable
-import org.apache.spark.sql.catalyst.expressions.NamedExpression
-import org.apache.spark.sql.catalyst.plans.logical.{Command, LogicalPlan, Project}
-import org.apache.spark.sql.execution.command._
-import org.apache.spark.sql.execution.datasources.{InsertIntoDataSourceCommand, InsertIntoHadoopFsRelationCommand, LogicalRelation}
-import org.apache.spark.sql.hive.execution.CreateHiveTableAsSelectCommand
-import org.apache.spark.sql.types.StructField
-
-import org.apache.submarine.spark.compatible.{CompatibleFunc, PersistedViewCompatible}
-import org.apache.submarine.spark.compatible.CompatibleCommand.SetDatabaseCommandCompatible
-import org.apache.submarine.spark.security.{SparkPrivilegeObject, SparkPrivilegeObjectType, SparkPrivObjectActionType}
-import org.apache.submarine.spark.security.SparkPrivObjectActionType.SparkPrivObjectActionType
-
-
-/**
- * [[LogicalPlan]] -> list of [[SparkPrivilegeObject]]s
- */
-private[sql] object PrivilegesBuilder {
-
-  /**
-   * Build input and output privilege objects from a Spark's [[LogicalPlan]]
-   *
-   * For [[ExplainCommand]]s, build from its child.
-   * For [[RunnableCommand]]s, build outputs if there is a target to write to, and build inputs
-   * for the embedded query if one exists.
-   *
-   * For other queries, build inputs.
-   *
-   * @param plan A Spark [[LogicalPlan]]
-   */
-  def build(plan: LogicalPlan): (Seq[SparkPrivilegeObject], Seq[SparkPrivilegeObject]) = {
-
-    def doBuild(plan: LogicalPlan): (Seq[SparkPrivilegeObject], Seq[SparkPrivilegeObject]) = {
-      val inputObjs = new ArrayBuffer[SparkPrivilegeObject]
-      val outputObjs = new ArrayBuffer[SparkPrivilegeObject]
-      plan match {
-        // RunnableCommand
-        case cmd: Command => buildCommand(cmd, inputObjs, outputObjs)
-        // Queries
-        case _ => buildQuery(plan, inputObjs)
-      }
-      (inputObjs, outputObjs)
-    }
-
-    plan match {
-      case e: ExplainCommand => doBuild(e.logicalPlan)
-      case p => doBuild(p)
-    }
-  }
-
-  /**
-   * Build SparkPrivilegeObjects from Spark LogicalPlan
-   * @param plan a Spark LogicalPlan used to generate SparkPrivilegeObjects
-   * @param privilegeObjects input or output spark privilege object list
-   * @param projectionList Projection list after pruning
-   */
-  private def buildQuery(
-       plan: LogicalPlan,
-       privilegeObjects: ArrayBuffer[SparkPrivilegeObject],
-       projectionList: Seq[NamedExpression] = Nil): Unit = {
-
-    /**
-     * Columns in Projection take priority for column level privilege checking
-     * @param table catalogTable of a given relation
-     */
-    def mergeProjection(table: CatalogTable): Unit = {
-      if (projectionList.isEmpty) {
-        addTableOrViewLevelObjs(
-          table.identifier,
-          privilegeObjects,
-          table.partitionColumnNames,
-          table.schema.fieldNames)
-      } else {
-        addTableOrViewLevelObjs(
-          table.identifier,
-          privilegeObjects,
-          table.partitionColumnNames.filter(projectionList.map(_.name).contains(_)),
-          projectionList.map(_.name))
-      }
-    }
-
-    plan match {
-      case p: Project => buildQuery(p.child, privilegeObjects, p.projectList)
-
-      case h if h.nodeName == "HiveTableRelation" =>
-        mergeProjection(getFieldVal(h, "tableMeta").asInstanceOf[CatalogTable])
-
-      case m if m.nodeName == "MetastoreRelation" =>
-        mergeProjection(getFieldVal(m, "catalogTable").asInstanceOf[CatalogTable])
-
-      case l: LogicalRelation if l.catalogTable.nonEmpty => mergeProjection(l.catalogTable.get)
-
-      case u: UnresolvedRelation =>
-        // Normally, we shouldn't meet UnresolvedRelation here in an optimized plan.
-        // Unfortunately, the real world is always a place where miracles happen.
-        // We check the privileges directly without resolving the plan and leave the
-        // resolution itself to Spark.
-        addTableOrViewLevelObjs(CompatibleFunc.tableIdentifier(u), privilegeObjects)
-
-      case p =>
-        for (child <- p.children) {
-          buildQuery(child, privilegeObjects, projectionList)
-        }
-    }
-  }
-
-  /**
-   * Build SparkPrivilegeObjects from Spark LogicalPlan
-   * @param plan a Spark LogicalPlan used to generate SparkPrivilegeObjects
-   * @param inputObjs input spark privilege object list
-   * @param outputObjs output spark privilege object list
-   */
-  private def buildCommand(
-      plan: LogicalPlan,
-      inputObjs: ArrayBuffer[SparkPrivilegeObject],
-      outputObjs: ArrayBuffer[SparkPrivilegeObject]): Unit = {
-    plan match {
-      case a: AlterDatabasePropertiesCommand => addDbLevelObjs(a.databaseName, outputObjs)
-
-      case a if a.nodeName == "AlterTableAddColumnsCommand" =>
-        addTableOrViewLevelObjs(
-          getFieldVal(a, "table").asInstanceOf[TableIdentifier],
-          inputObjs,
-          columns = getFieldVal(a, "colsToAdd").asInstanceOf[Seq[StructField]].map(_.name))
-        addTableOrViewLevelObjs(
-          getFieldVal(a, "table").asInstanceOf[TableIdentifier],
-          outputObjs,
-          columns = getFieldVal(a, "colsToAdd").asInstanceOf[Seq[StructField]].map(_.name))
-
-      case a: AlterTableAddPartitionCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a if a.nodeName == "AlterTableChangeColumnCommand" =>
-        addTableOrViewLevelObjs(
-          getFieldVal(a, "tableName").asInstanceOf[TableIdentifier],
-          inputObjs,
-          columns = Seq(getFieldVal(a, "columnName").asInstanceOf[String]))
-
-      case a: AlterTableDropPartitionCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableRecoverPartitionsCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableRenameCommand if !a.isView || a.oldName.database.nonEmpty =>
-        // rename tables / permanent views
-        addTableOrViewLevelObjs(a.oldName, inputObjs)
-        addTableOrViewLevelObjs(a.newName, outputObjs)
-
-      case a: AlterTableRenamePartitionCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableSerDePropertiesCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableSetLocationCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableSetPropertiesCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterTableUnsetPropertiesCommand =>
-        addTableOrViewLevelObjs(a.tableName, inputObjs)
-        addTableOrViewLevelObjs(a.tableName, outputObjs)
-
-      case a: AlterViewAsCommand =>
-        if (a.name.database.nonEmpty) {
-          // it's a permanent view
-          addTableOrViewLevelObjs(a.name, outputObjs)
-        }
-        buildQuery(a.query, inputObjs)
-
-      case a: AnalyzeColumnCommand =>
-        addTableOrViewLevelObjs(
-          a.tableIdent, inputObjs, columns = CompatibleFunc.analyzeColumnName(a))
-        addTableOrViewLevelObjs(
-          a.tableIdent, outputObjs, columns = CompatibleFunc.analyzeColumnName(a))
-
-      case a if a.nodeName == "AnalyzePartitionCommand" =>
-        addTableOrViewLevelObjs(
-          getFieldVal(a, "tableIdent").asInstanceOf[TableIdentifier], inputObjs)
-        addTableOrViewLevelObjs(
-          getFieldVal(a, "tableIdent").asInstanceOf[TableIdentifier], outputObjs)
-
-      case a: AnalyzeTableCommand =>
-        addTableOrViewLevelObjs(a.tableIdent, inputObjs, columns = Seq("RAW__DATA__SIZE"))
-        addTableOrViewLevelObjs(a.tableIdent, outputObjs)
-
-      case c: CacheTableCommand => c.plan.foreach {
-        buildQuery(_, inputObjs)
-      }
-
-      case c: CreateDatabaseCommand => addDbLevelObjs(c.databaseName, outputObjs)
-
-      case c: CreateDataSourceTableAsSelectCommand =>
-        addDbLevelObjs(c.table.identifier, outputObjs)
-        addTableOrViewLevelObjs(c.table.identifier, outputObjs, mode = c.mode)
-        buildQuery(c.query, inputObjs)
-
-      case c: CreateDataSourceTableCommand =>
-        addTableOrViewLevelObjs(c.table.identifier, outputObjs)
-
-      case c: CreateFunctionCommand if !c.isTemp =>
-        addDbLevelObjs(c.databaseName, outputObjs)
-        addFunctionLevelObjs(c.databaseName, c.functionName, outputObjs)
-
-      case c: CreateHiveTableAsSelectCommand =>
-        addDbLevelObjs(c.tableDesc.identifier, outputObjs)
-        addTableOrViewLevelObjs(c.tableDesc.identifier, outputObjs)
-        buildQuery(c.query, inputObjs)
-
-      case c: CreateTableCommand => addTableOrViewLevelObjs(c.table.identifier, outputObjs)
-
-      case c: CreateTableLikeCommand =>
-        addDbLevelObjs(c.targetTable, outputObjs)
-        addTableOrViewLevelObjs(c.targetTable, outputObjs)
-        // Hive doesn't check the source table's privileges; we do not follow that behaviour,
-        // because it would leak meta information
-        addDbLevelObjs(c.sourceTable, inputObjs)
-        addTableOrViewLevelObjs(c.sourceTable, inputObjs)
-
-      case c: CreateViewCommand =>
-        c.viewType match {
-          case PersistedViewCompatible.obj =>
-            // PersistedView will be tied to a database
-            addDbLevelObjs(c.name, outputObjs)
-            addTableOrViewLevelObjs(c.name, outputObjs)
-          case _ =>
-        }
-        buildQuery(c.child, inputObjs)
-
-      case d if d.nodeName == "DescribeColumnCommand" =>
-        addTableOrViewLevelObjs(
-          getFieldVal(d, "table").asInstanceOf[TableIdentifier],
-          inputObjs,
-          columns = getFieldVal(d, "colNameParts").asInstanceOf[Seq[String]])
-
-      case d: DescribeDatabaseCommand =>
-        addDbLevelObjs(d.databaseName, inputObjs)
-
-      case d: DescribeFunctionCommand =>
-        addFunctionLevelObjs(d.functionName.database, d.functionName.funcName, inputObjs)
-
-      case d: DescribeTableCommand => addTableOrViewLevelObjs(d.table, inputObjs)
-
-      case d: DropDatabaseCommand =>
-        // outputObjs alone are enough for the privilege check; inputObjs are added for
-        // consistency with Hive behaviour in case of unexpected issues.
-        addDbLevelObjs(d.databaseName, inputObjs)
-        addDbLevelObjs(d.databaseName, outputObjs)
-
-      case d: DropFunctionCommand =>
-        addFunctionLevelObjs(d.databaseName, d.functionName, outputObjs)
-
-      case d: DropTableCommand => addTableOrViewLevelObjs(d.tableName, outputObjs)
-
-      case i: InsertIntoDataSourceCommand =>
-        i.logicalRelation.catalogTable.foreach { table =>
-          addTableOrViewLevelObjs(
-            table.identifier,
-            outputObjs)
-        }
-        buildQuery(i.query, inputObjs)
-
-      case i if i.nodeName == "InsertIntoDataSourceDirCommand" =>
-        buildQuery(getFieldVal(i, "query").asInstanceOf[LogicalPlan], inputObjs)
-
-      case i: InsertIntoHadoopFsRelationCommand =>
-        // We could get the overwrite mode here, but a CTAS for a Hive table in text/orc
-        // format, or in parquet with spark.sql.hive.convertMetastoreParquet=false, can pass
-        // the privilege check without claiming the UPDATE privilege on the target table,
-        // which seems to match Hive's behaviour.
-        // So we ignore the overwrite mode here for consistency.
-        i.catalogTable foreach { t =>
-          addTableOrViewLevelObjs(
-            t.identifier,
-            outputObjs,
-            i.partitionColumns.map(_.name),
-            t.schema.fieldNames)
-        }
-        buildQuery(i.query, inputObjs)
-
-      case i if i.nodeName == "InsertIntoHiveDirCommand" =>
-        buildQuery(getFieldVal(i, "query").asInstanceOf[LogicalPlan], inputObjs)
-
-      case i if i.nodeName == "InsertIntoHiveTable" =>
-        addTableOrViewLevelObjs(
-          getFieldVal(i, "table").asInstanceOf[CatalogTable].identifier, outputObjs)
-        buildQuery(getFieldVal(i, "query").asInstanceOf[LogicalPlan], inputObjs)
-
-      case l: LoadDataCommand =>
-        addTableOrViewLevelObjs(l.table, outputObjs)
-        if (!l.isLocal) {
-          inputObjs += new SparkPrivilegeObject(SparkPrivilegeObjectType.DFS_URI, l.path, l.path)
-        }
-
-      case s if s.nodeName == "SaveIntoDataSourceCommand" =>
-        buildQuery(getFieldVal(s, "query").asInstanceOf[LogicalPlan], outputObjs)
-
-      case s: SetDatabaseCommandCompatible =>
-        addDbLevelObjs(CompatibleFunc.getCatLogName(s), inputObjs)
-
-      case s: ShowColumnsCommand => addTableOrViewLevelObjs(s.tableName, inputObjs)
-
-      case s: ShowCreateTableCommand => addTableOrViewLevelObjs(s.table, inputObjs)
-
-      case s: ShowFunctionsCommand => s.db.foreach(addDbLevelObjs(_, inputObjs))
-
-      case s: ShowPartitionsCommand => addTableOrViewLevelObjs(s.tableName, inputObjs)
-
-      case s: ShowTablePropertiesCommand => addTableOrViewLevelObjs(s.table, inputObjs)
-
-      case s: ShowTablesCommand => addDbLevelObjs(s.databaseName, inputObjs)
-
-      case s: TruncateTableCommand => addTableOrViewLevelObjs(s.tableName, outputObjs)
-
-      case _ =>
-      // AddFileCommand
-      // AddJarCommand
-      // ClearCacheCommand
-      // CreateTempViewUsing
-      // ListFilesCommand
-      // ListJarsCommand
-      // RefreshTable
-      // ResetCommand
-      // SetCommand
-      // ShowDatabasesCommand
-      // StreamingExplainCommand
-      // UncacheTableCommand
-    }
-  }
-
-  /**
-   * Add database level spark privilege objects to input or output list
-   * @param dbName database name as spark privilege object
-   * @param privilegeObjects input or output list
-   */
-  private def addDbLevelObjs(
-      dbName: String,
-      privilegeObjects: ArrayBuffer[SparkPrivilegeObject]): Unit = {
-    privilegeObjects += new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, dbName, dbName)
-  }
-
-  /**
-   * Add database level spark privilege objects to input or output list
-   * @param dbOption an option of database name as spark privilege object
-   * @param privilegeObjects input or output spark privilege object list
-   */
-  private def addDbLevelObjs(
-      dbOption: Option[String],
-      privilegeObjects: ArrayBuffer[SparkPrivilegeObject]): Unit = {
-    dbOption match {
-      case Some(db) =>
-        privilegeObjects += new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, db, db)
-      case _ =>
-    }
-  }
-
-  /**
-   * Add database level spark privilege objects to input or output list
-   * @param identifier table identifier whose database name is used as the spark privilege object
-   * @param privilegeObjects input or output spark privilege object list
-   */
-  private def addDbLevelObjs(
-      identifier: TableIdentifier,
-      privilegeObjects: ArrayBuffer[SparkPrivilegeObject]): Unit = {
-    identifier.database match {
-      case Some(db) =>
-        privilegeObjects += new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, db, db)
-      case _ =>
-    }
-  }
-
-  /**
-   * Add function level spark privilege objects to input or output list
-   * @param databaseName database name
-   * @param functionName function name as spark privilege object
-   * @param privilegeObjects input or output list
-   */
-  private def addFunctionLevelObjs(
-      databaseName: Option[String],
-      functionName: String,
-      privilegeObjects: ArrayBuffer[SparkPrivilegeObject]): Unit = {
-    databaseName match {
-      case Some(db) =>
-        privilegeObjects += new SparkPrivilegeObject(
-          SparkPrivilegeObjectType.FUNCTION, db, functionName)
-      case _ =>
-    }
-  }
-
-  /**
-   * Add table level spark privilege objects to input or output list
-   * @param identifier table identifier containing the database name and the table name used
-   *                   as the spark privilege object
-   * @param privilegeObjects input or output list
-   * @param partKeys partition keys of the table, if any
-   * @param columns columns of the table or view, if any
-   * @param mode Append or Overwrite
-   */
-  private def addTableOrViewLevelObjs(
-      identifier: TableIdentifier,
-      privilegeObjects: ArrayBuffer[SparkPrivilegeObject],
-      partKeys: Seq[String] = Nil,
-      columns: Seq[String] = Nil, mode: SaveMode = SaveMode.ErrorIfExists): Unit = {
-    identifier.database match {
-      case Some(db) =>
-        val tbName = identifier.table
-        val actionType = toActionType(mode)
-        privilegeObjects += new SparkPrivilegeObject(
-          SparkPrivilegeObjectType.TABLE_OR_VIEW,
-          db,
-          tbName,
-          partKeys,
-          columns,
-          actionType)
-      case _ =>
-    }
-  }
-
-  /**
-   * Map a SaveMode to the corresponding SparkPrivObjectActionType (INSERT, INSERT_OVERWRITE or OTHER)
-   *
-   * @param mode Append or Overwrite
-   */
-  private def toActionType(mode: SaveMode): SparkPrivObjectActionType = {
-    mode match {
-      case SaveMode.Append => SparkPrivObjectActionType.INSERT
-      case SaveMode.Overwrite => SparkPrivObjectActionType.INSERT_OVERWRITE
-      case _ => SparkPrivObjectActionType.OTHER
-    }
-  }
-}
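
For reference, the mapping above between Spark's SaveMode and the privilege action type can be restated in isolation; a minimal sketch, assuming only spark-sql on the classpath, with the SparkPrivObjectActionType values inlined as plain strings for illustration:

    import org.apache.spark.sql.SaveMode

    // Hypothetical standalone restatement of toActionType above; the enum
    // values are inlined as strings for illustration only.
    def actionTypeOf(mode: SaveMode): String = mode match {
      case SaveMode.Append    => "INSERT"
      case SaveMode.Overwrite => "INSERT_OVERWRITE"
      case _                  => "OTHER" // ErrorIfExists / Ignore imply no write action
    }

    assert(actionTypeOf(SaveMode.Overwrite) == "INSERT_OVERWRITE")
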
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAccessRequest.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAccessRequest.scala
deleted file mode 100644
index c9383b1..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAccessRequest.scala
+++ /dev/null
@@ -1,91 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.util.{Date, Locale}
-
-import scala.collection.JavaConverters._
-
-import org.apache.ranger.plugin.policyengine.{RangerAccessRequestImpl, RangerPolicyEngine}
-import org.apache.ranger.plugin.util.RangerAccessRequestUtil
-
-import org.apache.submarine.spark.security.SparkAccessType.SparkAccessType
-
-class RangerSparkAccessRequest private extends RangerAccessRequestImpl {
-
-  private var accessType = SparkAccessType.NONE
-
-  def this(
-      resource: RangerSparkResource,
-      user: String,
-      groups: Set[String],
-      opType: String,
-      accessType: SparkAccessType,
-      clusterName: String) {
-    this()
-    this.setResource(resource)
-    this.setUser(user)
-    this.setUserGroups(groups.asJava)
-    this.setAccessTime(new Date)
-    this.setAction(opType)
-    this.setSparkAccessType(accessType)
-    this.setUser(user)
-    this.setClusterName(clusterName)
-  }
-
-  def this(
-      resource: RangerSparkResource,
-      user: String,
-      groups: Set[String],
-      clusterName: String) = {
-    this(resource, user, groups, "METADATA OPERATION", SparkAccessType.USE, clusterName)
-  }
-
-  def getSparkAccessType: SparkAccessType = accessType
-
-  def setSparkAccessType(accessType: SparkAccessType): Unit = {
-    this.accessType = accessType
-    accessType match {
-      case SparkAccessType.USE => this.setAccessType(RangerPolicyEngine.ANY_ACCESS)
-      case SparkAccessType.ADMIN => this.setAccessType(RangerPolicyEngine.ADMIN_ACCESS)
-      case _ => this.setAccessType(accessType.toString.toLowerCase(Locale.ROOT))
-    }
-  }
-
-  def copy(): RangerSparkAccessRequest = {
-    val ret = new RangerSparkAccessRequest()
-    ret.setResource(getResource)
-    ret.setAccessType(getAccessType)
-    ret.setUser(getUser)
-    ret.setUserGroups(getUserGroups)
-    ret.setAccessTime(getAccessTime)
-    ret.setAction(getAction)
-    ret.setClientIPAddress(getClientIPAddress)
-    ret.setRemoteIPAddress(getRemoteIPAddress)
-    ret.setForwardedAddresses(getForwardedAddresses)
-    ret.setRequestData(getRequestData)
-    ret.setClientType(getClientType)
-    ret.setSessionId(getSessionId)
-    ret.setContext(RangerAccessRequestUtil.copyContext(getContext))
-    ret.accessType = accessType
-    ret.setClusterName(getClusterName)
-    ret
-  }
-}
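
A minimal sketch of how such requests were constructed, assuming the pre-removal classes above are on the classpath; the user, group, and cluster names are placeholders:

    import org.apache.submarine.spark.security.{RangerSparkAccessRequest, RangerSparkResource, SparkAccessType, SparkObjectType}

    // "alice", "analysts" and "demo-cluster" are made-up values.
    val table = RangerSparkResource(SparkObjectType.TABLE, Some("default"), "src")
    val selectReq = new RangerSparkAccessRequest(
      table, "alice", Set("analysts"), "QUERY", SparkAccessType.SELECT, "demo-cluster")

    // The shorter constructor defaults to a METADATA OPERATION with USE access.
    val metadataReq = new RangerSparkAccessRequest(table, "alice", Set("analysts"), "demo-cluster")
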
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuditHandler.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuditHandler.scala
deleted file mode 100644
index fff830a..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuditHandler.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import org.apache.ranger.plugin.audit.RangerDefaultAuditHandler
-
-case class RangerSparkAuditHandler() extends RangerDefaultAuditHandler {
-
-  // TODO(Kent Yao): Implement meaningful audit functions
-
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuthorizer.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuthorizer.scala
deleted file mode 100644
index 70fa093..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkAuthorizer.scala
+++ /dev/null
@@ -1,303 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import java.util.{List => JList, Locale}
-
-import scala.collection.JavaConverters._
-import scala.collection.mutable.ArrayBuffer
-
-import org.apache.commons.lang3.StringUtils
-import org.apache.commons.logging.LogFactory
-import org.apache.hadoop.conf.Configuration
-import org.apache.hadoop.fs.{FileSystem, Path}
-import org.apache.hadoop.fs.permission.FsAction
-import org.apache.hadoop.hive.common.FileUtils
-import org.apache.hadoop.hive.ql.security.authorization.plugin.HiveAccessControlException
-import org.apache.hadoop.security.UserGroupInformation
-import org.apache.ranger.authorization.utils.StringUtil
-import org.apache.ranger.plugin.policyengine.RangerAccessRequest
-import org.apache.ranger.plugin.util.RangerPerfTracer
-import org.apache.spark.sql.SparkSession
-
-import org.apache.submarine.spark.security.SparkAccessType.SparkAccessType
-import org.apache.submarine.spark.security.SparkObjectType.SparkObjectType
-import org.apache.submarine.spark.security.SparkOperationType.SparkOperationType
-
-object RangerSparkAuthorizer {
-  private val LOG = LogFactory.getLog(this.getClass.getSimpleName.stripSuffix("$"))
-
-  private def currentUser: UserGroupInformation = UserGroupInformation.getCurrentUser
-
-  def checkPrivileges(
-      spark: SparkSession,
-      opType: SparkOperationType,
-      inputs: Seq[SparkPrivilegeObject],
-      outputs: Seq[SparkPrivilegeObject]): Unit = {
-
-    val ugi = currentUser
-    val user = ugi.getShortUserName
-    val groups = ugi.getGroupNames.toSet
-    val auditHandler = new RangerSparkAuditHandler
-    val perf = if (RangerPerfTracer.isPerfTraceEnabled(PERF_SPARKAUTH_REQUEST_LOG)) {
-      RangerPerfTracer.getPerfTracer(PERF_SPARKAUTH_REQUEST_LOG,
-        "RangerSparkAuthorizer.checkPrivileges()")
-    } else {
-      null
-    }
-    try {
-      val requests = new ArrayBuffer[RangerSparkAccessRequest]()
-      if (inputs.isEmpty && opType == SparkOperationType.SHOWDATABASES) {
-        val resource = new RangerSparkResource(SparkObjectType.DATABASE, None)
-        requests += new RangerSparkAccessRequest(resource, user, groups, opType.toString,
-          SparkAccessType.USE, RangerSparkPlugin.getClusterName)
-      }
-
-      def addAccessRequest(objs: Seq[SparkPrivilegeObject], isInput: Boolean): Unit = {
-        objs.foreach { obj =>
-          val resource = getSparkResource(obj, opType)
-          if (resource != null) {
-            val objectName = obj.getObjectName
-            val objectType = resource.getObjectType
-            if (objectType == SparkObjectType.URI && isPathInFSScheme(objectName)) {
-              val fsAction = getURIAccessType(opType)
-              val hadoopConf = spark.sparkContext.hadoopConfiguration
-              if (!canAccessURI(user, fsAction, objectName, hadoopConf)) {
-                throw new HiveAccessControlException(s"Permission denied: user [$user] does not" +
-                  s" have [${fsAction.name}] privilege on [$objectName]")
-              }
-            } else {
-              val accessType = getAccessType(obj, opType, objectType, isInput)
-              if (accessType != SparkAccessType.NONE && !requests.exists(
-                o => o.getSparkAccessType == accessType && o.getResource == resource)) {
-                requests += new RangerSparkAccessRequest(resource, user, groups, opType.toString,
-                  accessType, RangerSparkPlugin.getClusterName)
-              }
-            }
-          }
-        }
-      }
-
-      addAccessRequest(inputs, isInput = true)
-      addAccessRequest(outputs, isInput = false)
-      requests.foreach { request =>
-        val resource = request.getResource.asInstanceOf[RangerSparkResource]
-        if (resource.getObjectType == SparkObjectType.COLUMN &&
-          StringUtils.contains(resource.getColumn, ",")) {
-          resource.setServiceDef(RangerSparkPlugin.getServiceDef)
-          val colReqs: JList[RangerAccessRequest] = resource.getColumn.split(",")
-            .filter(StringUtils.isNotBlank).map { c =>
-            val colRes = new RangerSparkResource(SparkObjectType.COLUMN,
-              Option(resource.getDatabase), resource.getTable, c)
-            val colReq = request.copy()
-            colReq.setResource(colRes)
-            colReq.asInstanceOf[RangerAccessRequest]
-          }.toList.asJava
-          val colResults = RangerSparkPlugin.isAccessAllowed(colReqs, auditHandler)
-          if (colResults != null) {
-            for (c <- colResults.asScala) {
-              if (c != null && !c.getIsAllowed) {
-                throw new SparkAccessControlException(s"Permission denied: user [$user] does not" +
-                  s" have [${request.getSparkAccessType}] privilege on [${resource.getAsString}]")
-              }
-            }
-          }
-        } else {
-          val result = RangerSparkPlugin.isAccessAllowed(request, auditHandler)
-          if (result != null && !result.getIsAllowed) {
-            throw new SparkAccessControlException(s"Permission denied: user [$user] does not" +
-              s" have [${request.getSparkAccessType}] privilege on [${resource.getAsString}]")
-          }
-        }
-      }
-    } finally {
-      // TODO(Kent Yao) add auditHandler.flush()
-      RangerPerfTracer.log(perf)
-    }
-  }
-
-  def isAllowed(obj: SparkPrivilegeObject): Boolean = {
-    val ugi = currentUser
-    val user = ugi.getShortUserName
-    val groups = ugi.getGroupNames.toSet
-    createSparkResource(obj) match {
-      case Some(resource) =>
-        val request =
-          new RangerSparkAccessRequest(resource, user, groups, RangerSparkPlugin.getClusterName)
-        val result = RangerSparkPlugin.isAccessAllowed(request)
-        if (result == null) {
-          LOG.error("Internal error: null RangerAccessResult received back from isAccessAllowed")
-          false
-        } else if (!result.getIsAllowed) {
-          if (LOG.isDebugEnabled) {
-            val path = resource.getAsString
-            LOG.debug(s"Permission denied: user [$user] does not have" +
-              s" [${request.getSparkAccessType}] privilege on [$path]. resource[$resource]," +
-              s" request[$request], result[$result]")
-          }
-          false
-        } else {
-          true
-        }
-      case _ =>
-        LOG.error("RangerSparkResource returned by createSparkResource is null")
-        false
-    }
-
-  }
-
-  private val PERF_SPARKAUTH_REQUEST_LOG = RangerPerfTracer.getPerfLogger("sparkauth.request")
-
-  def createSparkResource(privilegeObject: SparkPrivilegeObject): Option[RangerSparkResource] = {
-    val objectName = privilegeObject.getObjectName
-    val dbName = privilegeObject.getDbname
-    val objectType = privilegeObject.getType
-    objectType match {
-      case SparkPrivilegeObjectType.DATABASE =>
-        Some(RangerSparkResource(SparkObjectType.DATABASE, Option(objectName)))
-      case SparkPrivilegeObjectType.TABLE_OR_VIEW =>
-        Some(RangerSparkResource(SparkObjectType.TABLE, Option(dbName), objectName))
-      case _ =>
-        LOG.warn(s"RangerSparkAuthorizer.createSparkResource: unexpected objectType: $objectType")
-        None
-    }
-  }
-
-  private def getAccessType(
-      obj: SparkPrivilegeObject,
-      opType: SparkOperationType,
-      objectType: SparkObjectType,
-      isInput: Boolean): SparkAccessType = {
-    objectType match {
-      case SparkObjectType.URI if isInput => SparkAccessType.READ
-      case SparkObjectType.URI => SparkAccessType.WRITE
-      case _ => obj.getActionType match {
-        case SparkPrivObjectActionType.INSERT | SparkPrivObjectActionType.INSERT_OVERWRITE =>
-          SparkAccessType.UPDATE
-        case SparkPrivObjectActionType.OTHER =>
-          import org.apache.submarine.spark.security.SparkOperationType._
-          opType match {
-            case CREATEDATABASE if obj.getType == SparkPrivilegeObjectType.DATABASE =>
-              SparkAccessType.CREATE
-            case CREATEFUNCTION if obj.getType == SparkPrivilegeObjectType.FUNCTION =>
-              SparkAccessType.CREATE
-            case CREATETABLE | CREATEVIEW | CREATETABLE_AS_SELECT
-              if obj.getType == SparkPrivilegeObjectType.TABLE_OR_VIEW =>
-              if (isInput) SparkAccessType.SELECT else SparkAccessType.CREATE
-            case ALTERDATABASE | ALTERTABLE_ADDCOLS |
-                 ALTERTABLE_ADDPARTS | ALTERTABLE_DROPPARTS |
-                 ALTERTABLE_LOCATION | ALTERTABLE_PROPERTIES | ALTERTABLE_SERDEPROPERTIES |
-                 ALTERVIEW_RENAME | MSCK => SparkAccessType.ALTER
-            case DROPFUNCTION | DROPTABLE | DROPVIEW | DROPDATABASE =>
-              SparkAccessType.DROP
-            case LOAD => if (isInput) SparkAccessType.SELECT else SparkAccessType.UPDATE
-            case QUERY | SHOW_CREATETABLE | SHOWPARTITIONS |
-                 SHOW_TBLPROPERTIES => SparkAccessType.SELECT
-            case SHOWCOLUMNS | DESCTABLE =>
-              StringUtil.toLower(RangerSparkPlugin.showColumnsOption) match {
-                case "show-all" => SparkAccessType.USE
-                case _ => SparkAccessType.SELECT
-              }
-            case SHOWDATABASES | SWITCHDATABASE | DESCDATABASE | SHOWTABLES => SparkAccessType.USE
-            case TRUNCATETABLE => SparkAccessType.UPDATE
-            case _ => SparkAccessType.NONE
-          }
-      }
-    }
-  }
-
-  private def getObjectType(
-      obj: SparkPrivilegeObject,
-      opType: SparkOperationType): SparkObjectType = {
-    obj.getType match {
-      case SparkPrivilegeObjectType.DATABASE | null => SparkObjectType.DATABASE
-      case SparkPrivilegeObjectType.TABLE_OR_VIEW if !StringUtil.isEmpty(obj.getColumns.asJava) =>
-        SparkObjectType.COLUMN
-      case SparkPrivilegeObjectType.TABLE_OR_VIEW
-          if opType.toString.toLowerCase(Locale.ROOT).contains("view") =>
-        SparkObjectType.VIEW
-      case SparkPrivilegeObjectType.TABLE_OR_VIEW => SparkObjectType.TABLE
-      case SparkPrivilegeObjectType.FUNCTION => SparkObjectType.FUNCTION
-      case SparkPrivilegeObjectType.DFS_URI => SparkObjectType.URI
-      case _ => SparkObjectType.NONE
-    }
-  }
-
-  private def getSparkResource(
-      obj: SparkPrivilegeObject,
-      opType: SparkOperationType): RangerSparkResource = {
-    import org.apache.submarine.spark.security.SparkObjectType._
-    val objectType = getObjectType(obj, opType)
-    val resource = objectType match {
-      case DATABASE => RangerSparkResource(objectType, Option(obj.getDbname))
-      case TABLE | VIEW | FUNCTION =>
-        RangerSparkResource(objectType, Option(obj.getDbname), obj.getObjectName)
-      case COLUMN =>
-        RangerSparkResource(objectType, Option(obj.getDbname), obj.getObjectName,
-          obj.getColumns.mkString(","))
-      case _ => null
-    }
-    if (resource != null) resource.setServiceDef(RangerSparkPlugin.getServiceDef)
-    resource
-  }
-
-  private def canAccessURI(
-      user: String,
-      action: FsAction,
-      uri: String,
-      conf: Configuration): Boolean = action match {
-    case FsAction.NONE => true
-    case _ =>
-      try {
-        val filePath = new Path(uri)
-        val fs = FileSystem.get(filePath.toUri, conf)
-        val fileStat = fs.globStatus(filePath)
-        if (fileStat != null && fileStat.nonEmpty) fileStat.forall { file =>
-          FileUtils.isOwnerOfFileHierarchy(fs, file, user) ||
-            FileUtils.isActionPermittedForFileHierarchy(fs, file, user, action)
-        } else {
-          val file = FileUtils.getPathOrParentThatExists(fs, filePath)
-          FileUtils.checkFileAccessWithImpersonation(fs, file, action, user)
-          true
-        }
-      } catch {
-        case e: Exception =>
-          LOG.error("Error getting permissions for " + uri, e)
-          false
-      }
-  }
-
-  private def getURIAccessType(operationType: SparkOperationType): FsAction = {
-    import org.apache.submarine.spark.security.SparkOperationType._
-
-    operationType match {
-      case LOAD => FsAction.READ
-      case CREATEDATABASE | CREATETABLE | CREATETABLE_AS_SELECT | ALTERDATABASE |
-           ALTERTABLE_ADDCOLS | ALTERTABLE_RENAMECOL | ALTERTABLE_RENAMEPART | ALTERTABLE_RENAME |
-           ALTERTABLE_DROPPARTS | ALTERTABLE_ADDPARTS | ALTERTABLE_PROPERTIES |
-           ALTERTABLE_SERDEPROPERTIES | ALTERTABLE_LOCATION | QUERY => FsAction.ALL
-      case _ => FsAction.NONE
-    }
-  }
-
-  private def isPathInFSScheme(objectName: String): Boolean = {
-    objectName.nonEmpty && RangerSparkPlugin.fsScheme.exists(objectName.startsWith)
-  }
-}
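
For context, a minimal sketch of invoking the authorizer directly, assuming the classes above are on the classpath; the database, table, and column names are placeholders:

    import org.apache.spark.sql.SparkSession
    import org.apache.submarine.spark.security.{RangerSparkAuthorizer, SparkOperationType, SparkPrivilegeObject, SparkPrivilegeObjectType}

    // Models a query reading two columns of default.src; checkPrivileges throws
    // SparkAccessControlException if Ranger denies SELECT on any of them.
    def checkSrcSelect(spark: SparkSession): Unit = {
      val input = new SparkPrivilegeObject(
        SparkPrivilegeObjectType.TABLE_OR_VIEW, "default", "src", Nil, Seq("key", "value"))
      RangerSparkAuthorizer.checkPrivileges(spark, SparkOperationType.QUERY, Seq(input), Nil)
    }
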
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkResource.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkResource.scala
deleted file mode 100644
index ef28747..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/RangerSparkResource.scala
+++ /dev/null
@@ -1,93 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import org.apache.ranger.plugin.policyengine.RangerAccessResourceImpl
-
-import org.apache.submarine.spark.security.SparkObjectType.SparkObjectType
-
-class RangerSparkResource(
-    objectType: SparkObjectType,
-    databaseOrUrl: Option[String],
-    tableOrUdf: String,
-    column: String) extends RangerAccessResourceImpl {
-  import SparkObjectType._
-  import RangerSparkResource._
-
-  def this(objectType: SparkObjectType, databaseOrUrl: Option[String], tableOrUdf: String) = {
-    this(objectType, databaseOrUrl, tableOrUdf, null)
-  }
-
-  def this(objectType: SparkObjectType, databaseOrUrl: Option[String]) = {
-    this(objectType, databaseOrUrl, null)
-  }
-
-  objectType match {
-    case DATABASE => setValue(KEY_DATABASE, databaseOrUrl.getOrElse("*"))
-    case FUNCTION =>
-      setValue(KEY_DATABASE, databaseOrUrl.getOrElse(""))
-      setValue(KEY_UDF, tableOrUdf)
-    case COLUMN =>
-      setValue(KEY_DATABASE, databaseOrUrl.getOrElse("*"))
-      setValue(KEY_TABLE, tableOrUdf)
-      setValue(KEY_COLUMN, column)
-    case TABLE | VIEW =>
-      setValue(KEY_DATABASE, databaseOrUrl.getOrElse("*"))
-      setValue(KEY_TABLE, tableOrUdf)
-    case URI => setValue(KEY_URL, databaseOrUrl.getOrElse("*"))
-    case _ =>
-  }
-
-  def getObjectType: SparkObjectType = objectType
-
-  def getDatabase: String = getValue(KEY_DATABASE).asInstanceOf[String]
-
-  def getTable: String = getValue(KEY_TABLE).asInstanceOf[String]
-
-  def getUdf: String = getValue(KEY_UDF).asInstanceOf[String]
-
-  def getColumn: String = getValue(KEY_COLUMN).asInstanceOf[String]
-
-  def getUrl: String = getValue(KEY_URL).asInstanceOf[String]
-
-}
-
-object RangerSparkResource {
-
-  def apply(objectType: SparkObjectType, databaseOrUrl: Option[String], tableOrUdf: String,
-            column: String): RangerSparkResource = {
-    new RangerSparkResource(objectType, databaseOrUrl, tableOrUdf, column)
-  }
-
-  def apply(objectType: SparkObjectType, databaseOrUrl: Option[String],
-            tableOrUdf: String): RangerSparkResource = {
-    new RangerSparkResource(objectType, databaseOrUrl, tableOrUdf)
-  }
-
-  def apply(objectType: SparkObjectType, databaseOrUrl: Option[String]): RangerSparkResource = {
-    new RangerSparkResource(objectType, databaseOrUrl)
-  }
-
-  private val KEY_DATABASE = "database"
-  private val KEY_TABLE = "table"
-  private val KEY_UDF = "udf"
-  private val KEY_COLUMN = "column"
-  private val KEY_URL = "url"
-}
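
A short sketch of the resource mapping above, assuming the class is on the classpath; the database, table, and column names are placeholders:

    import org.apache.submarine.spark.security.{RangerSparkResource, SparkObjectType}

    // A COLUMN resource fills Ranger's database/table/column keys.
    val col = RangerSparkResource(SparkObjectType.COLUMN, Some("default"), "src", "key")
    assert(col.getDatabase == "default" && col.getTable == "src" && col.getColumn == "key")

    // With no database given, a DATABASE resource falls back to the "*" wildcard.
    val anyDb = RangerSparkResource(SparkObjectType.DATABASE, None)
    assert(anyDb.getDatabase == "*")
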
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessControlException.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessControlException.scala
deleted file mode 100644
index b5923d5..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessControlException.scala
+++ /dev/null
@@ -1,27 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-class SparkAccessControlException(msg: String, e: Throwable) extends Exception(msg, e) {
-
-  def this(msg: String) = {
-    this(msg, null)
-  }
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessType.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessType.scala
deleted file mode 100644
index abe9636..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkAccessType.scala
+++ /dev/null
@@ -1,37 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-object SparkAccessType extends Enumeration {
-
-  type SparkAccessType = Value
-
-  val NONE,
-      CREATE,
-      ALTER,
-      DROP,
-      SELECT,
-      UPDATE,
-      USE,
-      READ,
-      WRITE,
-      ALL,
-      ADMIN = Value
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkObjectType.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkObjectType.scala
deleted file mode 100644
index 8374fd9..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkObjectType.scala
+++ /dev/null
@@ -1,32 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-object SparkObjectType extends Enumeration {
-  type SparkObjectType = Value
-
-  val NONE,
-      DATABASE,
-      TABLE,
-      VIEW,
-      COLUMN,
-      FUNCTION,
-      URI = Value
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkOperationType.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkOperationType.scala
deleted file mode 100644
index 8cc8d7e..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkOperationType.scala
+++ /dev/null
@@ -1,63 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-object SparkOperationType extends Enumeration {
-
-  type SparkOperationType = Value
-
-  val ALTERDATABASE,
-      ALTERTABLE_ADDCOLS,
-      ALTERTABLE_ADDPARTS,
-      ALTERTABLE_RENAMECOL,
-      ALTERTABLE_DROPPARTS,
-      ALTERTABLE_RENAMEPART,
-      ALTERTABLE_RENAME,
-      ALTERTABLE_PROPERTIES,
-      ALTERTABLE_SERDEPROPERTIES,
-      ALTERTABLE_LOCATION,
-      ALTERVIEW_RENAME,
-      CREATEDATABASE,
-      CREATETABLE,
-      CREATETABLE_AS_SELECT,
-      CREATEFUNCTION,
-      CREATEVIEW,
-      DESCDATABASE,
-      DESCFUNCTION,
-      DESCTABLE,
-      DROPDATABASE,
-      DROPFUNCTION,
-      DROPTABLE,
-      DROPVIEW,
-      EXPLAIN,
-      LOAD,
-      MSCK,
-      QUERY,
-      SHOWCONF,
-      SHOW_CREATETABLE,
-      SHOWCOLUMNS,
-      SHOWDATABASES,
-      SHOWFUNCTIONS,
-      SHOWPARTITIONS,
-      SHOWTABLES,
-      SHOW_TBLPROPERTIES,
-      SWITCHDATABASE,
-      TRUNCATETABLE = Value
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivObjectActionType.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivObjectActionType.scala
deleted file mode 100644
index 155279c..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivObjectActionType.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-object SparkPrivObjectActionType extends Enumeration {
-  type SparkPrivObjectActionType = Value
-
-  val OTHER,
-      INSERT,
-      INSERT_OVERWRITE = Value
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObject.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObject.scala
deleted file mode 100644
index fb45f5a..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObject.scala
+++ /dev/null
@@ -1,138 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-import scala.collection.JavaConverters._
-
-import org.apache.submarine.spark.security.SparkPrivilegeObjectType.SparkPrivilegeObjectType
-import org.apache.submarine.spark.security.SparkPrivObjectActionType.SparkPrivObjectActionType
-
-class SparkPrivilegeObject(
-    private val typ: SparkPrivilegeObjectType,
-    private val dbname: String,
-    private val objectName: String,
-    private val partKeys: Seq[String],
-    private val columns: Seq[String],
-    private val actionType: SparkPrivObjectActionType)
-  extends Ordered[SparkPrivilegeObject] {
-
-  override def compare(that: SparkPrivilegeObject): Int = {
-    typ compareTo that.typ match {
-      case 0 =>
-        compare(dbname, that.dbname) match {
-          case 0 =>
-            compare(objectName, that.objectName) match {
-              case 0 =>
-                compare(partKeys, that.partKeys) match {
-                  case 0 => compare(columns, that.columns)
-                  case o => o
-                }
-              case o => o
-            }
-          case o => o
-        }
-      case o => o
-    }
-  }
-
-  private def compare(o1: String, o2: String): Int = {
-    if (o1 != null) {
-      if (o2 != null) o1.compareTo(o2) else 1
-    } else {
-      if (o2 != null) -1 else 0
-    }
-  }
-
-  private def compare(o1: Seq[String], o2: Seq[String]): Int = {
-    if (o1 != null) {
-      if (o2 != null) {
-        for ((x, y) <- o1.zip(o2)) {
-          val ret = compare(x, y)
-          if (ret != 0) {
-            return ret
-          }
-        }
-        if (o1.size > o2.size) {
-          1
-        } else if (o1.size < o2.size) {
-          -1
-        } else {
-          0
-        }
-      } else {
-        1
-      }
-    } else {
-      if (o2 != null) {
-        -1
-      } else {
-        0
-      }
-    }
-  }
-
-  def this(typ: SparkPrivilegeObjectType, dbname: String, objectName: String,
-           partKeys: Seq[String], columns: Seq[String]) =
-    this(typ, dbname, objectName, partKeys, columns, SparkPrivObjectActionType.OTHER)
-
-  def this(typ: SparkPrivilegeObjectType, dbname: String, objectName: String,
-           actionType: SparkPrivObjectActionType) =
-    this(typ, dbname, objectName, Nil, Nil, actionType)
-
-  def this(typ: SparkPrivilegeObjectType, dbname: String, objectName: String) =
-    this(typ, dbname, objectName, SparkPrivObjectActionType.OTHER)
-
-  def getType: SparkPrivilegeObjectType = typ
-
-  def getDbname: String = dbname
-
-  def getObjectName: String = objectName
-
-  def getActionType: SparkPrivObjectActionType = actionType
-
-  def getPartKeys: Seq[String] = partKeys
-
-  def getColumns: Seq[String] = columns
-
-  override def toString: String = {
-    val name = typ match {
-      case SparkPrivilegeObjectType.DATABASE => dbname
-      case SparkPrivilegeObjectType.TABLE_OR_VIEW =>
-        getDbObjectName + (if (partKeys != null) partKeys.asJava.toString else "")
-      case SparkPrivilegeObjectType.FUNCTION => getDbObjectName
-      case _ => ""
-    }
-
-    val at = if (actionType != null) {
-      actionType match {
-        case SparkPrivObjectActionType.INSERT |
-             SparkPrivObjectActionType.INSERT_OVERWRITE => ", action=" + actionType
-        case _ => ""
-      }
-    } else {
-      ""
-    }
-    "Object [type=" + typ + ", name=" + name + at + "]"
-  }
-
-  private def getDbObjectName: String = {
-    (if (dbname == null) "" else dbname + ".") + objectName
-  }
-}
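
The Ordered implementation compares type first, then database, object name, partition keys, and columns, so privilege objects can be sorted and de-duplicated; a minimal sketch with made-up database names:

    import org.apache.submarine.spark.security.{SparkPrivilegeObject, SparkPrivilegeObjectType}

    val a = new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, "db_a", "db_a")
    val b = new SparkPrivilegeObject(SparkPrivilegeObjectType.DATABASE, "db_b", "db_b")
    assert(a < b)                      // Ordered supplies <, <=, >, >=
    assert(Seq(b, a).sorted.head == a) // and an implicit Ordering for sorting
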
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObjectType.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObjectType.scala
deleted file mode 100644
index 97bc883..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/SparkPrivilegeObjectType.scala
+++ /dev/null
@@ -1,29 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security
-
-object SparkPrivilegeObjectType extends Enumeration {
-  type SparkPrivilegeObjectType = Value
-
-  val DATABASE,
-      TABLE_OR_VIEW,
-      FUNCTION,
-      DFS_URI = Value
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkAuthzExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkAuthzExtension.scala
deleted file mode 100644
index 5d0b34c..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkAuthzExtension.scala
+++ /dev/null
@@ -1,42 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.api
-
-import org.apache.spark.sql.SparkSessionExtensions
-import org.apache.spark.sql.catalyst.optimizer.{SubmarineConfigurationCheckExtension, SubmarineSparkRangerAuthorizationExtension}
-
-import org.apache.submarine.spark.security.Extensions
-
-/**
- * ACL Management for Apache Spark SQL with Apache Ranger, enabling:
- * <ul>
- *   <li>Table/Column level authorization</li>
- * </ul>
- *
- * To enable it for Spark SQL, register it via the spark extensions configuration:
- *
- * spark.sql.extensions=org.apache.submarine.spark.security.api.RangerSparkAuthzExtension
- */
-class RangerSparkAuthzExtension extends Extensions {
-  override def apply(ext: SparkSessionExtensions): Unit = {
-    ext.injectCheckRule(SubmarineConfigurationCheckExtension)
-    ext.injectOptimizerRule(SubmarineSparkRangerAuthorizationExtension)
-  }
-}
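
A minimal sketch of wiring the extension up from code, equivalent to setting spark.sql.extensions as in the scaladoc above; the app name and master are placeholders:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .appName("authz-demo")   // placeholder
      .master("local[*]")      // placeholder
      .config("spark.sql.extensions",
        "org.apache.submarine.spark.security.api.RangerSparkAuthzExtension")
      .getOrCreate()
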
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkDCLExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkDCLExtension.scala
deleted file mode 100644
index adec7c8..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkDCLExtension.scala
+++ /dev/null
@@ -1,62 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.api
-
-import org.apache.spark.sql.SparkSessionExtensions
-
-import org.apache.submarine.spark.security.Extensions
-import org.apache.submarine.spark.security.parser.SubmarineSqlParser
-
-/**
- * An extension for Spark SQL to activate DCL (Data Control Language)
- *
- * Scala example to create a `SparkSession` with the Submarine DCL parser:
- * {{{
- *    import org.apache.spark.sql.SparkSession
- *
- *    val spark = SparkSession
- *       .builder()
- *       .appName("...")
- *       .master("...")
- *       .config("spark.sql.extensions",
- *         "org.apache.submarine.spark.security.api.RangerSparkDCLExtension")
- *       .getOrCreate()
- * }}}
- *
- * Java example to create a `SparkSession` with the Submarine DCL parser:
- * {{{
- *    import org.apache.spark.sql.SparkSession;
- *
- *    SparkSession spark = SparkSession
- *                 .builder()
- *                 .appName("...")
- *                 .master("...")
- *                 .config("spark.sql.extensions",
- *                     "org.apache.submarine.spark.security.api.RangerSparkDCLExtension")
- *                 .getOrCreate();
- * }}}
- *
- * @since 0.4.0
- */
-class RangerSparkDCLExtension extends Extensions {
-  override def apply(ext: SparkSessionExtensions): Unit = {
-    ext.injectParser((_, parser) => new SubmarineSqlParser(parser))
-  }
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkSQLExtension.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkSQLExtension.scala
deleted file mode 100644
index efa286e..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/api/RangerSparkSQLExtension.scala
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.api
-
-import org.apache.spark.sql.SparkSessionExtensions
-import org.apache.spark.sql.catalyst.optimizer.{SubmarineConfigurationCheckExtension, SubmarineDataMaskingExtension, SubmarinePushPredicatesThroughExtensions, SubmarineRowFilterExtension, SubmarineSparkRangerAuthorizationExtension}
-import org.apache.spark.sql.execution.SubmarineSparkPlanOmitStrategy
-
-import org.apache.submarine.spark.security.Extensions
-
-/**
- * ACL Management for Apache Spark SQL with Apache Ranger, enabling:
- * <ul>
- *   <li>Table/Column level authorization</li>
- *   <li>Row level filtering</li>
- *   <li>Data masking</li>
- * </ul>
- *
- * To enable it for Spark SQL, register it via the spark extensions configuration:
- *
- * spark.sql.extensions=org.apache.submarine.spark.security.api.RangerSparkSQLExtension
- */
-class RangerSparkSQLExtension extends Extensions {
-  override def apply(ext: SparkSessionExtensions): Unit = {
-    ext.injectCheckRule(SubmarineConfigurationCheckExtension)
-    ext.injectOptimizerRule(SubmarineSparkRangerAuthorizationExtension)
-    ext.injectOptimizerRule(SubmarineRowFilterExtension)
-    ext.injectOptimizerRule(SubmarineDataMaskingExtension)
-    ext.injectOptimizerRule(SubmarinePushPredicatesThroughExtensions)
-    ext.injectPlannerStrategy(SubmarineSparkPlanOmitStrategy)
-  }
-}
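
Alternatively, because the extension is just a `SparkSessionExtensions => Unit` function, it can be attached through the builder's `withExtensions` hook; a sketch with placeholder names:

    import org.apache.spark.sql.SparkSession
    import org.apache.submarine.spark.security.api.RangerSparkSQLExtension

    val spark = SparkSession.builder()
      .appName("full-acl-demo")  // placeholder
      .master("local[*]")        // placeholder
      .withExtensions(new RangerSparkSQLExtension())
      .getOrCreate()
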
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/package.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/package.scala
deleted file mode 100644
index d777276..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/package.scala
+++ /dev/null
@@ -1,28 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark
-
-import org.apache.spark.sql.SparkSessionExtensions
-
-package object security {
-
-  type Extensions = SparkSessionExtensions => Unit
-
-}
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlAstBuilder.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlAstBuilder.scala
deleted file mode 100644
index 01c64c1..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/SubmarineSqlAstBuilder.scala
+++ /dev/null
@@ -1,48 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.parser
-
-import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
-
-import org.apache.submarine.spark.security.command.{CreateRoleCommand, DropRoleCommand, ShowCurrentRolesCommand, ShowRolesCommand}
-import org.apache.submarine.spark.security.parser.SubmarineSqlBaseParser.{CreateRoleContext, DropRoleContext, ShowCurrentRolesContext, ShowRolesContext, SingleStatementContext}
-
-class SubmarineSqlAstBuilder extends SubmarineSqlBaseBaseVisitor[AnyRef] {
-
-  override def visitSingleStatement(ctx: SingleStatementContext): LogicalPlan = {
-    visit(ctx.statement()).asInstanceOf[LogicalPlan]
-  }
-
-  override def visitCreateRole(ctx: CreateRoleContext): AnyRef = {
-    CreateRoleCommand(ctx.identifier().getText)
-  }
-
-  override def visitDropRole(ctx: DropRoleContext): AnyRef = {
-    DropRoleCommand(ctx.identifier().getText)
-  }
-
-  override def visitShowRoles(ctx: ShowRolesContext): AnyRef = {
-    ShowRolesCommand()
-  }
-
-  override def visitShowCurrentRoles(ctx: ShowCurrentRolesContext): AnyRef = {
-    ShowCurrentRolesCommand()
-  }
-}
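
For reference, a sketch of the four DCL statements this builder recognizes, driven through `spark.sql`; it assumes a session built with RangerSparkDCLExtension enabled and a placeholder role name:

    import org.apache.spark.sql.SparkSession

    def dclDemo(spark: SparkSession): Unit = {
      spark.sql("CREATE ROLE analyst")       // visitCreateRole        -> CreateRoleCommand
      spark.sql("DROP ROLE analyst")         // visitDropRole          -> DropRoleCommand
      spark.sql("SHOW ROLES").show()         // visitShowRoles         -> ShowRolesCommand
      spark.sql("SHOW CURRENT ROLES").show() // visitShowCurrentRoles  -> ShowCurrentRolesCommand
    }
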
diff --git a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/UpperCaseCharStream.scala b/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/UpperCaseCharStream.scala
deleted file mode 100644
index 42424b9..0000000
--- a/submarine-security/spark-security/src/main/scala/org/apache/submarine/spark/security/parser/UpperCaseCharStream.scala
+++ /dev/null
@@ -1,59 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *   http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.apache.submarine.spark.security.parser
-
-import org.antlr.v4.runtime.{CharStream, CodePointCharStream, IntStream}
-import org.antlr.v4.runtime.misc.Interval
-
-// scalastyle:off line.size.limit
-/**
- * Adapted from the Apache Spark project
- * @see https://github.com/apache/spark/blob/v2.4.4/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/parser/ParseDriver.scala#L157
- */
-// scalastyle:on line.size.limit
-private[parser] class UpperCaseCharStream(wrapped: CodePointCharStream) extends CharStream {
-  override def consume(): Unit = wrapped.consume()
-  override def getSourceName(): String = wrapped.getSourceName
-  override def index(): Int = wrapped.index
-  override def mark(): Int = wrapped.mark
-  override def release(marker: Int): Unit = wrapped.release(marker)
-  override def seek(where: Int): Unit = wrapped.seek(where)
-  override def size(): Int = wrapped.size
-
-  override def getText(interval: Interval): String = {
-    // ANTLR 4.7's CodePointCharStream implementations have bugs when
-    // getText() is called with an empty stream, or intervals where
-    // the start > end. See
-    // https://github.com/antlr/antlr4/commit/ac9f7530 for one fix
-    // that is not yet in a released ANTLR artifact.
-    if (size() > 0 && (interval.b - interval.a >= 0)) {
-      wrapped.getText(interval)
-    } else {
-      ""
-    }
-  }
-
-  // scalastyle:off
-  override def LA(i: Int): Int = {
-    val la = wrapped.LA(i)
-    if (la == 0 || la == IntStream.EOF) la else Character.toUpperCase(la)
-  }
-  // scalastyle:on
-}
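
A minimal parse-driver sketch tying this stream to the AST builder above; it
mirrors the Spark ParseDriver pattern cited in the scaladoc. The lexer and
parser class names are the ones ANTLR generates from the module's
SubmarineSqlBase grammar, and the sketch assumes it lives in the same
package, since the stream is private[parser]:

    import org.antlr.v4.runtime.{CharStreams, CommonTokenStream}
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan

    def parsePlan(sqlText: String): LogicalPlan = {
      // Upper-case what the lexer sees so keywords match case-insensitively,
      // while getText() still returns the original-cased input
      val stream = new UpperCaseCharStream(CharStreams.fromString(sqlText))
      val lexer = new SubmarineSqlBaseLexer(stream)
      val parser = new SubmarineSqlBaseParser(new CommonTokenStream(lexer))
      new SubmarineSqlAstBuilder().visitSingleStatement(parser.singleStatement())
    }
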
diff --git a/submarine-security/spark-security/src/test/resources/data/files/kv1.txt b/submarine-security/spark-security/src/test/resources/data/files/kv1.txt
deleted file mode 100644
index 9825414..0000000
--- a/submarine-security/spark-security/src/test/resources/data/files/kv1.txt
+++ /dev/null
@@ -1,500 +0,0 @@
-238val_238
-86val_86
-311val_311
-27val_27
-165val_165
-409val_409
-255val_255
-278val_278
-98val_98
-484val_484
-265val_265
-193val_193
-401val_401
-150val_150
-273val_273
-224val_224
-369val_369
-66val_66
-128val_128
-213val_213
-146val_146
-406val_406
-429val_429
-374val_374
-152val_152
-469val_469
-145val_145
-495val_495
-37val_37
-327val_327
-281val_281
-277val_277
-209val_209
-15val_15
-82val_82
-403val_403
-166val_166
-417val_417
-430val_430
-252val_252
-292val_292
-219val_219
-287val_287
-153val_153
-193val_193
-338val_338
-446val_446
-459val_459
-394val_394
-237val_237
-482val_482
-174val_174
-413val_413
-494val_494
-207val_207
-199val_199
-466val_466
-208val_208
-174val_174
-399val_399
-396val_396
-247val_247
-417val_417
-489val_489
-162val_162
-377val_377
-397val_397
-309val_309
-365val_365
-266val_266
-439val_439
-342val_342
-367val_367
-325val_325
-167val_167
-195val_195
-475val_475
-17val_17
-113val_113
-155val_155
-203val_203
-339val_339
-0val_0
-455val_455
-128val_128
-311val_311
-316val_316
-57val_57
-302val_302
-205val_205
-149val_149
-438val_438
-345val_345
-129val_129
-170val_170
-20val_20
-489val_489
-157val_157
-378val_378
-221val_221
-92val_92
-111val_111
-47val_47
-72val_72
-4val_4
-280val_280
-35val_35
-427val_427
-277val_277
-208val_208
-356val_356
-399val_399
-169val_169
-382val_382
-498val_498
-125val_125
-386val_386
-437val_437
-469val_469
-192val_192
-286val_286
-187val_187
-176val_176
-54val_54
-459val_459
-51val_51
-138val_138
-103val_103
-239val_239
-213val_213
-216val_216
-430val_430
-278val_278
-176val_176
-289val_289
-221val_221
-65val_65
-318val_318
-332val_332
-311val_311
-275val_275
-137val_137
-241val_241
-83val_83
-333val_333
-180val_180
-284val_284
-12val_12
-230val_230
-181val_181
-67val_67
-260val_260
-404val_404
-384val_384
-489val_489
-353val_353
-373val_373
-272val_272
-138val_138
-217val_217
-84val_84
-348val_348
-466val_466
-58val_58
-8val_8
-411val_411
-230val_230
-208val_208
-348val_348
-24val_24
-463val_463
-431val_431
-179val_179
-172val_172
-42val_42
-129val_129
-158val_158
-119val_119
-496val_496
-0val_0
-322val_322
-197val_197
-468val_468
-393val_393
-454val_454
-100val_100
-298val_298
-199val_199
-191val_191
-418val_418
-96val_96
-26val_26
-165val_165
-327val_327
-230val_230
-205val_205
-120val_120
-131val_131
-51val_51
-404val_404
-43val_43
-436val_436
-156val_156
-469val_469
-468val_468
-308val_308
-95val_95
-196val_196
-288val_288
-481val_481
-457val_457
-98val_98
-282val_282
-197val_197
-187val_187
-318val_318
-318val_318
-409val_409
-470val_470
-137val_137
-369val_369
-316val_316
-169val_169
-413val_413
-85val_85
-77val_77
-0val_0
-490val_490
-87val_87
-364val_364
-179val_179
-118val_118
-134val_134
-395val_395
-282val_282
-138val_138
-238val_238
-419val_419
-15val_15
-118val_118
-72val_72
-90val_90
-307val_307
-19val_19
-435val_435
-10val_10
-277val_277
-273val_273
-306val_306
-224val_224
-309val_309
-389val_389
-327val_327
-242val_242
-369val_369
-392val_392
-272val_272
-331val_331
-401val_401
-242val_242
-452val_452
-177val_177
-226val_226
-5val_5
-497val_497
-402val_402
-396val_396
-317val_317
-395val_395
-58val_58
-35val_35
-336val_336
-95val_95
-11val_11
-168val_168
-34val_34
-229val_229
-233val_233
-143val_143
-472val_472
-322val_322
-498val_498
-160val_160
-195val_195
-42val_42
-321val_321
-430val_430
-119val_119
-489val_489
-458val_458
-78val_78
-76val_76
-41val_41
-223val_223
-492val_492
-149val_149
-449val_449
-218val_218
-228val_228
-138val_138
-453val_453
-30val_30
-209val_209
-64val_64
-468val_468
-76val_76
-74val_74
-342val_342
-69val_69
-230val_230
-33val_33
-368val_368
-103val_103
-296val_296
-113val_113
-216val_216
-367val_367
-344val_344
-167val_167
-274val_274
-219val_219
-239val_239
-485val_485
-116val_116
-223val_223
-256val_256
-263val_263
-70val_70
-487val_487
-480val_480
-401val_401
-288val_288
-191val_191
-5val_5
-244val_244
-438val_438
-128val_128
-467val_467
-432val_432
-202val_202
-316val_316
-229val_229
-469val_469
-463val_463
-280val_280
-2val_2
-35val_35
-283val_283
-331val_331
-235val_235
-80val_80
-44val_44
-193val_193
-321val_321
-335val_335
-104val_104
-466val_466
-366val_366
-175val_175
-403val_403
-483val_483
-53val_53
-105val_105
-257val_257
-406val_406
-409val_409
-190val_190
-406val_406
-401val_401
-114val_114
-258val_258
-90val_90
-203val_203
-262val_262
-348val_348
-424val_424
-12val_12
-396val_396
-201val_201
-217val_217
-164val_164
-431val_431
-454val_454
-478val_478
-298val_298
-125val_125
-431val_431
-164val_164
-424val_424
-187val_187
-382val_382
-5val_5
-70val_70
-397val_397
-480val_480
-291val_291
-24val_24
-351val_351
-255val_255
-104val_104
-70val_70
-163val_163
-438val_438
-119val_119
-414val_414
-200val_200
-491val_491
-237val_237
-439val_439
-360val_360
-248val_248
-479val_479
-305val_305
-417val_417
-199val_199
-444val_444
-120val_120
-429val_429
-169val_169
-443val_443
-323val_323
-325val_325
-277val_277
-230val_230
-478val_478
-178val_178
-468val_468
-310val_310
-317val_317
-333val_333
-493val_493
-460val_460
-207val_207
-249val_249
-265val_265
-480val_480
-83val_83
-136val_136
-353val_353
-172val_172
-214val_214
-462val_462
-233val_233
-406val_406
-133val_133
-175val_175
-189val_189
-454val_454
-375val_375
-401val_401
-421val_421
-407val_407
-384val_384
-256val_256
-26val_26
-134val_134
-67val_67
-384val_384
-379val_379
-18val_18
-462val_462
-492val_492
-100val_100
-298val_298
-9val_9
-341val_341
-498val_498
-146val_146
-458val_458
-362val_362
-186val_186
-285val_285
-348val_348
-167val_167
-18val_18
-273val_273
-183val_183
-281val_281
-344val_344
-97val_97
-469val_469
-315val_315
-84val_84
-28val_28
-37val_37
-448val_448
-152val_152
-348val_348
-307val_307
-194val_194
-414val_414
-477val_477
-222val_222
-126val_126
-90val_90
-169val_169
-403val_403
-400val_400
-200val_200
-97val_97
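
A note on the 500 rows above: kv1.txt is the classic Hive key/value fixture,
and each row is really <key>^A<value> with Hive's default \u0001 field
delimiter; the non-printing delimiter is why rows render here as "238val_238"
rather than "238" / "val_238". A sketch of how such a fixture is typically
materialized as the default.src table that the Ranger policies below target
(session setup and paths are illustrative):

    // Assumes a Hive-enabled SparkSession bound to `spark`
    spark.sql("CREATE TABLE IF NOT EXISTS default.src (key INT, value STRING)")
    spark.sql(
      "LOAD DATA LOCAL INPATH 'src/test/resources/data/files/kv1.txt' " +
        "INTO TABLE default.src")
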
diff --git a/submarine-security/spark-security/src/test/resources/log4j.properties b/submarine-security/spark-security/src/test/resources/log4j.properties
deleted file mode 100644
index d633dc8..0000000
--- a/submarine-security/spark-security/src/test/resources/log4j.properties
+++ /dev/null
@@ -1,25 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-#   http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Log everything at ERROR level to the console
-log4j.rootCategory=ERROR, console
-log4j.appender.console=org.apache.log4j.ConsoleAppender
-log4j.appender.console.target=System.err
-log4j.appender.console.layout=org.apache.log4j.PatternLayout
-log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
-
-log4j.logger.org.apache.hadoop.security.ShellBasedUnixGroupsMapping=OFF
diff --git a/submarine-security/spark-security/src/test/resources/ranger-spark-audit.xml b/submarine-security/spark-security/src/test/resources/ranger-spark-audit.xml
deleted file mode 100644
index 32721d3..0000000
--- a/submarine-security/spark-security/src/test/resources/ranger-spark-audit.xml
+++ /dev/null
@@ -1,31 +0,0 @@
-<?xml version="1.0"?>
-<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements. See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License. You may obtain a copy of the License at
-  ~
-  ~   http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-<configuration xmlns:xi="http://www.w3.org/2001/XInclude">
-
-  <property>
-    <name>xasecure.audit.is.enabled</name>
-    <value>true</value>
-  </property>
-
-  <property>
-    <name>xasecure.audit.destination.db</name>
-    <value>false</value>
-  </property>
-
-</configuration>
diff --git a/submarine-security/spark-security/src/test/resources/ranger-spark-security.xml b/submarine-security/spark-security/src/test/resources/ranger-spark-security.xml
deleted file mode 100644
index 4999835..0000000
--- a/submarine-security/spark-security/src/test/resources/ranger-spark-security.xml
+++ /dev/null
@@ -1,45 +0,0 @@
-<?xml version="1.0"?>
-<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
-<!--
-  ~ Licensed to the Apache Software Foundation (ASF) under one or more
-  ~ contributor license agreements. See the NOTICE file distributed with
-  ~ this work for additional information regarding copyright ownership.
-  ~ The ASF licenses this file to You under the Apache License, Version 2.0
-  ~ (the "License"); you may not use this file except in compliance with
-  ~ the License. You may obtain a copy of the License at
-  ~
-  ~   http://www.apache.org/licenses/LICENSE-2.0
-  ~
-  ~ Unless required by applicable law or agreed to in writing, software
-  ~ distributed under the License is distributed on an "AS IS" BASIS,
-  ~ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-  ~ See the License for the specific language governing permissions and
-  ~ limitations under the License.
-  -->
-<configuration xmlns:xi="http://www.w3.org/2001/XInclude">
-
-  <property>
-    <name>ranger.plugin.spark.service.name</name>
-    <value>hive_jenkins</value>
-    <description>
-      Name of the Ranger service containing policies for this plugin instance
-    </description>
-  </property>
-
-  <property>
-    <name>ranger.plugin.spark.policy.source.impl</name>
-    <value>org.apache.submarine.spark.security.RangerAdminClientImpl</value>
-    <description>
-      Policy source implementation; in tests this loads policies from a local file instead of Ranger Admin.
-    </description>
-  </property>
-
-  <property>
-    <name>ranger.plugin.spark.policy.cache.dir</name>
-    <value>target/test-classes</value>
-    <description>
-      Directory where Ranger policies are cached after successful retrieval from the source
-    </description>
-  </property>
-
-</configuration>
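
The policy.source.impl override above is what lets the tests run without a
live Ranger Admin: RangerAdminClientImpl (deleted elsewhere in this commit)
serves policies from the bundled sparkSql_hive_jenkins.json instead of
fetching them over REST. A sketch of that stub pattern; the Ranger method
signatures below match the releases this module built against and should be
treated as assumptions on newer Ranger versions:

    import java.nio.file.{FileSystems, Files}

    import com.google.gson.GsonBuilder
    import org.apache.ranger.admin.client.RangerAdminRESTClient
    import org.apache.ranger.plugin.util.ServicePolicies

    class RangerAdminClientImpl extends RangerAdminRESTClient {
      private val cacheFile = "sparkSql_hive_jenkins.json"
      // Ranger serializes policy timestamps with this date format
      private val gson =
        new GsonBuilder().setDateFormat("yyyyMMdd-HH:mm:ss.SSS-Z").create()

      override def init(serviceName: String, appId: String,
          configPropertyPrefix: String): Unit = {}

      // Serve the canned policy set instead of calling Ranger Admin over REST
      override def getServicePoliciesIfUpdated(
          lastKnownVersion: Long,
          lastActivationTimeInMillis: Long): ServicePolicies = {
        val base = getClass.getProtectionDomain.getCodeSource.getLocation.getPath
        val path = FileSystems.getDefault.getPath(base, cacheFile)
        gson.fromJson(new String(Files.readAllBytes(path)),
          classOf[ServicePolicies])
      }
    }
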
diff --git a/submarine-security/spark-security/src/test/resources/sparkSql_hive_jenkins.json b/submarine-security/spark-security/src/test/resources/sparkSql_hive_jenkins.json
deleted file mode 100644
index 3a7b473..0000000
--- a/submarine-security/spark-security/src/test/resources/sparkSql_hive_jenkins.json
+++ /dev/null
@@ -1,2680 +0,0 @@
-{
-  "serviceName": "hive_jenkins",
-  "serviceId": 1,
-  "policyVersion": 85,
-  "policyUpdateTime": "20190429-21:36:09.000-+0800",
-  "policies": [
-    {
-      "service": "hive_jenkins",
-      "name": "all - url",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "Policy for all - url",
-      "isAuditEnabled": true,
-      "resources": {
-        "url": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": true
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "admin"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": true
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [],
-      "id": 1,
-      "guid": "cf7e6725-492f-434f-bffe-6bb4e3147246",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "all - database, table, column",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "Policy for all - database, table, column",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "admin"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": true
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [],
-      "id": 2,
-      "guid": "3b96138a-af4d-48bc-9544-58c5bfa1979b",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "all - database, udf",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "Policy for all - database, udf",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "udf": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "admin"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": true
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [],
-      "id": 3,
-      "guid": "db08fbb0-61da-4f33-8144-ccd89816151d",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "src_key _less_than_20",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "src"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key\u003c20"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "serviceType": "hive",
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 4,
-      "guid": "f588a9ed-f7b1-48f7-9d0d-c12cf2b9b7ed",
-      "isEnabled": true,
-      "version": 26
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "default",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 5,
-      "guid": "2db6099d-e4f1-41df-9d24-f2f47bed618e",
-      "isEnabled": true,
-      "version": 5
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "default_kent",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "key"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "src"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "kent"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 5,
-      "guid": "fd24db19-f7cc-4e13-a8ba-bbd5a07a2d8d",
-      "isEnabled": true,
-      "version": 5
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "src_val_show_last_4",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "src"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_LAST_4",
-            "valueExpr": ""
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 6,
-      "guid": "b1261fcc-b2cd-49f2-85e8-93f254f987ec",
-      "isEnabled": true,
-      "version": 10
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "store_sales",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "equality",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "store_sales"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "ss_sold_date_sk\u003d2451546"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 7,
-      "guid": "08fa307f-77fa-4586-83d0-21d0eb68b0fc",
-      "isEnabled": true,
-      "version": 4
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "default",
-      "policyType": 0,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "*"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [
-        {
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            },
-            {
-              "type": "update",
-              "isAllowed": true
-            },
-            {
-              "type": "create",
-              "isAllowed": true
-            },
-            {
-              "type": "drop",
-              "isAllowed": true
-            },
-            {
-              "type": "alter",
-              "isAllowed": true
-            },
-            {
-              "type": "index",
-              "isAllowed": true
-            },
-            {
-              "type": "lock",
-              "isAllowed": true
-            },
-            {
-              "type": "all",
-              "isAllowed": true
-            },
-            {
-              "type": "read",
-              "isAllowed": true
-            },
-            {
-              "type": "write",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 8,
-      "guid": "cfd49756-2d80-492d-bd26-6f67d531f28c",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "catalog_page",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "key in another table",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "catalog_page"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "cp_start_date_sk in (select d_date_sk from date_dim)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 9,
-      "guid": "ec617d1b-b85d-434f-b9db-8ef0178620f1",
-      "isEnabled": true,
-      "version": 2
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "call_center",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "is not null",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "call_center"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "cc_name is not null"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 10,
-      "guid": "c8259509-61ae-48f8-867f-be8cac339764",
-      "isEnabled": true,
-      "version": 2
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "catalog_returns",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "or expression",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "catalog_returns"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "cr_item_sk is null or cr_item_sk \u003e\u003d0"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 11,
-      "guid": "58aa8789-799b-4ce7-820e-9ed625ff2206",
-      "isEnabled": true,
-      "version": 2
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "date_dim",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "AND and UDF",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "date_dim"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "d_date_sk\u003d0 and d_date\u003dcurrent_date()"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 12,
-      "guid": "cc7b3ede-e483-4ba9-9584-2907f3237df0",
-      "isEnabled": true,
-      "version": 2
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "reason",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "row filter expression with a key in the table itself",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "reason"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "r_reason_sk in (select r_reason_sk from reason)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 14,
-      "guid": "4c8d06ae-73ea-4ff8-aedb-4aeae6865768",
-      "isEnabled": true,
-      "version": 2
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "inventory",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "scalar expression with the table itself",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "inventory"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "inv_item_sk\u003d(select count(1) from inventory)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 15,
-      "guid": "1e3da1db-47f3-465e-a604-aaf3d3a8de8e",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "item_i_item_id",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "i_item_id"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "item"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_LAST_4",
-            "valueExpr": ""
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 16,
-      "guid": "3bf13c7b-14b7-40cf-a7ed-913a3e528a11",
-      "isEnabled": true,
-      "version": 3
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "customer_address",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "ca_state"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "customer_address"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_LAST_4",
-            "valueExpr": ""
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 17,
-      "guid": "a047b76d-ea97-4893-b469-94cc944b3edc",
-      "isEnabled": true,
-      "version": 4
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "customer",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "c_customer_id"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "customer"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 18,
-      "guid": "ac2d963e-635f-49a8-a96c-ded88f68e731",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "date_dim_2",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "d_year"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "date_dim"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_NULL"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 19,
-      "guid": "07e7df0d-2cf7-4630-b796-31798a4496d4",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "item_i_brand_id",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "i_brand_id"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "item"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_HASH"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 20,
-      "guid": "35b5e3f7-c9f0-42d1-9118-56dc37ff42f5",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "item_i_item_sk",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "i_item_sk"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "item"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_FIRST_4"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 21,
-      "guid": "7e16c0ca-927a-4e95-b42e-c93b62cb6dfa",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "item_i_class_id",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "i_class_id"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "item"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_NULL"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 22,
-      "guid": "b7847238-3a14-4d56-8257-b8625a7f25a1",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl1_key_equals_0",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl1"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key\u003d0"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 23,
-      "guid": "d52bc8de-2a6b-4f7c-ab26-fbaf22c05eb7",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl2_key_in_set",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl2"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key in (0, 1, 2)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 24,
-      "guid": "06008a40-9b33-4699-8782-cc7e85101b85",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl3_key_in_subquery",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl3"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key in (select key from rangertbl2 where key \u003c 100)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 25,
-      "guid": "d0ca382a-1d62-4faa-8b9b-aeb36d4e443e",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl4_key_in_self",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl4"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key in (select key from rangertbl4)"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 26,
-      "guid": "b2b730af-d106-41f2-a21e-c29626adf6f3",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl5_key_udf",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl5"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "current_date()\u003d\"2019-04-28\""
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 27,
-      "guid": "0540df7e-fa14-4a41-b7d2-479fb42ddf5f",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl6_key_and_or",
-      "policyType": 2,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl6"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [],
-      "rowFilterPolicyItems": [
-        {
-          "rowFilterInfo": {
-            "filterExpr": "key\u003e1 and key\u003c10 or key \u003d500"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 28,
-      "guid": "5805bb62-291e-44b1-81e2-9f5c5b2b3cca",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl1_value_redact",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl1"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 29,
-      "guid": "9e7a290a-3d24-4f19-a4c6-2cf0637204ab",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl2_value_sf4",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl2"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_FIRST_4"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 30,
-      "guid": "9d50a525-b24c-4cf5-a885-d10d426368d1",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl3_value_hash",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl3"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_HASH"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 31,
-      "guid": "ed1868a1-bf79-4721-a3d5-6815cc7d4986",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl4_value_nullify",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl4"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_NULL"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 32,
-      "guid": "98a04cd7-8d14-4466-adc9-126d87a3af69",
-      "isEnabled": true,
-      "version": 1
-    },
-    {
-      "service": "hive_jenkins",
-      "name": "rangertbl5_value_show_last_4",
-      "policyType": 1,
-      "policyPriority": 0,
-      "description": "",
-      "isAuditEnabled": true,
-      "resources": {
-        "database": {
-          "values": [
-            "default",
-            "spark_catalog"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "column": {
-          "values": [
-            "value"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        },
-        "table": {
-          "values": [
-            "rangertbl5"
-          ],
-          "isExcludes": false,
-          "isRecursive": false
-        }
-      },
-      "policyItems": [],
-      "denyPolicyItems": [],
-      "allowExceptions": [],
-      "denyExceptions": [],
-      "dataMaskPolicyItems": [
-        {
-          "dataMaskInfo": {
-            "dataMaskType": "MASK_SHOW_LAST_4"
-          },
-          "accesses": [
-            {
-              "type": "select",
-              "isAllowed": true
-            }
-          ],
-          "users": [
-            "bob"
-          ],
-          "groups": [],
-          "conditions": [],
-          "delegateAdmin": false
-        }
-      ],
-      "rowFilterPolicyItems": [],
-      "options": {},
-      "validitySchedules": [],
-      "policyLabels": [
-        ""
-      ],
-      "id": 32,
-      "guid": "b3f1f1e0-2bd6-4b20-8a32-a531006ae151",
-      "isEnabled": true,
-      "version": 1
-    }
-  ],
-  "serviceDef": {
-    "name": "hive",
-    "implClass": "org.apache.ranger.services.hive.RangerServiceHive",
-    "label": "Hive Server2",
-    "description": "Hive Server2",
-    "options": {
-      "enableDenyAndExceptionsInPolicies": "true"
-    },
-    "configs": [
-      {
-        "itemId": 1,
-        "name": "username",
-        "type": "string",
-        "mandatory": true,
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Username"
-      },
-      {
-        "itemId": 2,
-        "name": "password",
-        "type": "password",
-        "mandatory": true,
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Password"
-      },
-      {
-        "itemId": 3,
-        "name": "jdbc.driverClassName",
-        "type": "string",
-        "mandatory": true,
-        "defaultValue": "org.apache.hive.jdbc.HiveDriver",
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": ""
-      },
-      {
-        "itemId": 4,
-        "name": "jdbc.url",
-        "type": "string",
-        "mandatory": true,
-        "defaultValue": "",
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "{\"TextFieldWithIcon\":true, \"info\": \"1.For Remote Mode, eg.\u003cbr\u003ejdbc:hive2://\u0026lt;host\u0026gt;:\u0026lt;port\u0026gt;\u003cbr\u003e2.For Embedded Mode (no host or port), eg.\u003cbr\u003ejdbc:hive2:///;initFile\u003d\u0026lt;file\u0026gt;\u003cbr\u003e3.For HTTP Mode, eg.\u003cbr\u003ejdbc:hive2://\u0026lt;host\u0026gt;:\u0026lt;port\u0026gt;/;\u003cbr\u003etransportMode\u003dhttp;httpPath\u003d\u0026lt;httpPath\u0026gt;\u003cbr\u003e4.For SSL Mode, e [...]
-      },
-      {
-        "itemId": 5,
-        "name": "commonNameForCertificate",
-        "type": "string",
-        "mandatory": false,
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Common Name for Certificate"
-      }
-    ],
-    "resources": [
-      {
-        "itemId": 1,
-        "name": "database",
-        "type": "string",
-        "level": 10,
-        "mandatory": true,
-        "lookupSupported": true,
-        "recursiveSupported": false,
-        "excludesSupported": true,
-        "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-        "matcherOptions": {
-          "wildCard": "true",
-          "ignoreCase": "true"
-        },
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Hive Database",
-        "description": "Hive Database",
-        "accessTypeRestrictions": [],
-        "isValidLeaf": false
-      },
-      {
-        "itemId": 5,
-        "name": "url",
-        "type": "string",
-        "level": 10,
-        "mandatory": true,
-        "lookupSupported": false,
-        "recursiveSupported": true,
-        "excludesSupported": false,
-        "matcher": "org.apache.ranger.plugin.resourcematcher.RangerPathResourceMatcher",
-        "matcherOptions": {
-          "wildCard": "true",
-          "ignoreCase": "false"
-        },
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "URL",
-        "description": "URL",
-        "accessTypeRestrictions": [],
-        "isValidLeaf": true
-      },
-      {
-        "itemId": 2,
-        "name": "table",
-        "type": "string",
-        "level": 20,
-        "parent": "database",
-        "mandatory": true,
-        "lookupSupported": true,
-        "recursiveSupported": false,
-        "excludesSupported": true,
-        "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-        "matcherOptions": {
-          "wildCard": "true",
-          "ignoreCase": "true"
-        },
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Hive Table",
-        "description": "Hive Table",
-        "accessTypeRestrictions": [],
-        "isValidLeaf": false
-      },
-      {
-        "itemId": 3,
-        "name": "udf",
-        "type": "string",
-        "level": 20,
-        "parent": "database",
-        "mandatory": true,
-        "lookupSupported": true,
-        "recursiveSupported": false,
-        "excludesSupported": true,
-        "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-        "matcherOptions": {
-          "wildCard": "true",
-          "ignoreCase": "true"
-        },
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Hive UDF",
-        "description": "Hive UDF",
-        "accessTypeRestrictions": [],
-        "isValidLeaf": true
-      },
-      {
-        "itemId": 4,
-        "name": "column",
-        "type": "string",
-        "level": 30,
-        "parent": "table",
-        "mandatory": true,
-        "lookupSupported": true,
-        "recursiveSupported": false,
-        "excludesSupported": true,
-        "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-        "matcherOptions": {
-          "wildCard": "true",
-          "ignoreCase": "true"
-        },
-        "validationRegEx": "",
-        "validationMessage": "",
-        "uiHint": "",
-        "label": "Hive Column",
-        "description": "Hive Column",
-        "accessTypeRestrictions": [],
-        "isValidLeaf": true
-      }
-    ],
-    "accessTypes": [
-      {
-        "itemId": 1,
-        "name": "select",
-        "label": "select",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 2,
-        "name": "update",
-        "label": "update",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 3,
-        "name": "create",
-        "label": "Create",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 4,
-        "name": "drop",
-        "label": "Drop",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 5,
-        "name": "alter",
-        "label": "Alter",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 6,
-        "name": "index",
-        "label": "Index",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 7,
-        "name": "lock",
-        "label": "Lock",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 8,
-        "name": "all",
-        "label": "All",
-        "impliedGrants": [
-          "select",
-          "update",
-          "create",
-          "drop",
-          "alter",
-          "index",
-          "lock",
-          "read",
-          "write"
-        ]
-      },
-      {
-        "itemId": 9,
-        "name": "read",
-        "label": "Read",
-        "impliedGrants": []
-      },
-      {
-        "itemId": 10,
-        "name": "write",
-        "label": "Write",
-        "impliedGrants": []
-      }
-    ],
-    "policyConditions": [],
-    "contextEnrichers": [],
-    "enums": [],
-    "dataMaskDef": {
-      "maskTypes": [
-        {
-          "itemId": 1,
-          "name": "MASK",
-          "label": "Redact",
-          "description": "Replace lowercase with \u0027x\u0027, uppercase with \u0027X\u0027, digits with \u00270\u0027",
-          "transformer": "mask({col})",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 2,
-          "name": "MASK_SHOW_LAST_4",
-          "label": "Partial mask: show last 4",
-          "description": "Show last 4 characters; replace rest with \u0027x\u0027",
-          "transformer": "mask_show_last_n({col}, 4, \u0027x\u0027, \u0027x\u0027, \u0027x\u0027, -1, \u00271\u0027)",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 3,
-          "name": "MASK_SHOW_FIRST_4",
-          "label": "Partial mask: show first 4",
-          "description": "Show first 4 characters; replace rest with \u0027x\u0027",
-          "transformer": "mask_show_first_n({col}, 4, \u0027x\u0027, \u0027x\u0027, \u0027x\u0027, -1, \u00271\u0027)",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 4,
-          "name": "MASK_HASH",
-          "label": "Hash",
-          "description": "Hash the value",
-          "transformer": "mask_hash({col})",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 5,
-          "name": "MASK_NULL",
-          "label": "Nullify",
-          "description": "Replace with NULL",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 6,
-          "name": "MASK_NONE",
-          "label": "Unmasked (retain original value)",
-          "description": "No masking",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 12,
-          "name": "MASK_DATE_SHOW_YEAR",
-          "label": "Date: show only year",
-          "description": "Date: show only year",
-          "transformer": "mask({col}, \u0027x\u0027, \u0027x\u0027, \u0027x\u0027, -1, \u00271\u0027, 1, 0, -1)",
-          "dataMaskOptions": {}
-        },
-        {
-          "itemId": 13,
-          "name": "CUSTOM",
-          "label": "Custom",
-          "description": "Custom",
-          "dataMaskOptions": {}
-        }
-      ],
-      "accessTypes": [
-        {
-          "itemId": 1,
-          "name": "select",
-          "label": "select",
-          "impliedGrants": []
-        }
-      ],
-      "resources": [
-        {
-          "itemId": 1,
-          "name": "database",
-          "type": "string",
-          "level": 10,
-          "mandatory": true,
-          "lookupSupported": true,
-          "recursiveSupported": false,
-          "excludesSupported": false,
-          "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-          "matcherOptions": {
-            "wildCard": "false",
-            "ignoreCase": "true"
-          },
-          "validationRegEx": "",
-          "validationMessage": "",
-          "uiHint": "{ \"singleValue\":true }",
-          "label": "Hive Database",
-          "description": "Hive Database",
-          "accessTypeRestrictions": [],
-          "isValidLeaf": false
-        },
-        {
-          "itemId": 2,
-          "name": "table",
-          "type": "string",
-          "level": 20,
-          "parent": "database",
-          "mandatory": true,
-          "lookupSupported": true,
-          "recursiveSupported": false,
-          "excludesSupported": false,
-          "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-          "matcherOptions": {
-            "wildCard": "false",
-            "ignoreCase": "true"
-          },
-          "validationRegEx": "",
-          "validationMessage": "",
-          "uiHint": "{ \"singleValue\":true }",
-          "label": "Hive Table",
-          "description": "Hive Table",
-          "accessTypeRestrictions": [],
-          "isValidLeaf": false
-        },
-        {
-          "itemId": 4,
-          "name": "column",
-          "type": "string",
-          "level": 30,
-          "parent": "table",
-          "mandatory": true,
-          "lookupSupported": true,
-          "recursiveSupported": false,
-          "excludesSupported": false,
-          "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-          "matcherOptions": {
-            "wildCard": "false",
-            "ignoreCase": "true"
-          },
-          "validationRegEx": "",
-          "validationMessage": "",
-          "uiHint": "{ \"singleValue\":true }",
-          "label": "Hive Column",
-          "description": "Hive Column",
-          "accessTypeRestrictions": [],
-          "isValidLeaf": true
-        }
-      ]
-    },
-    "rowFilterDef": {
-      "accessTypes": [
-        {
-          "itemId": 1,
-          "name": "select",
-          "label": "select",
-          "impliedGrants": []
-        }
-      ],
-      "resources": [
-        {
-          "itemId": 1,
-          "name": "database",
-          "type": "string",
-          "level": 10,
-          "mandatory": true,
-          "lookupSupported": true,
-          "recursiveSupported": false,
-          "excludesSupported": false,
-          "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-          "matcherOptions": {
-            "wildCard": "false",
-            "ignoreCase": "true"
-          },
-          "validationRegEx": "",
-          "validationMessage": "",
-          "uiHint": "{ \"singleValue\":true }",
-          "label": "Hive Database",
-          "description": "Hive Database",
-          "accessTypeRestrictions": [],
-          "isValidLeaf": false
-        },
-        {
-          "itemId": 2,
-          "name": "table",
-          "type": "string",
-          "level": 20,
-          "parent": "database",
-          "mandatory": true,
-          "lookupSupported": true,
-          "recursiveSupported": false,
-          "excludesSupported": false,
-          "matcher": "org.apache.ranger.plugin.resourcematcher.RangerDefaultResourceMatcher",
-          "matcherOptions": {
-            "wildCard": "false",
-            "ignoreCase": "true"
-          },
-          "validationRegEx": "",
-          "validationMessage": "",
-          "uiHint": "{ \"singleValue\":true }",
-          "label": "Hive Table",
-          "description": "Hive Table",
-          "accessTypeRestrictions": [],
-          "isValidLeaf": true
-        }
-      ]
-    },
-    "id": 3,
-    "guid": "3e1afb5a-184a-4e82-9d9c-87a5cacc243c",
-    "isEnabled": true,
-    "createTime": "20190401-20:14:36.000-+0800",
-    "updateTime": "20190401-20:14:36.000-+0800",
-    "version": 1
-  },
-  "auditMode": "audit-default"
-}
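
For reference, the policies and serviceDef above combine at query time:
a matching row-filter policy contributes its filterExpr as a WHERE
predicate, and a data-mask policy wraps the column in the transformer
template from dataMaskDef (with {col} substituted). Below is a minimal
sketch of the effective queries for user bob, assembled by hand from the
rangertbl3 and rangertbl5 entries above; it assumes the Hive mask UDFs
(mask_hash, mask_show_last_n) are resolvable in the Spark session, and
the plan the plugin actually produces may differ in shape.

-- Original statements submitted by bob
SELECT key, value FROM default.rangertbl3;
SELECT key, value FROM default.rangertbl5;

-- Effective statement after enforcement (sketch):
-- rangertbl3 gets the subquery row filter plus the MASK_HASH transformer
SELECT
  key,
  mask_hash(value) AS value
FROM default.rangertbl3
WHERE key IN (SELECT key FROM rangertbl2 WHERE key < 100);

-- rangertbl5 gets the current_date() row filter plus MASK_SHOW_LAST_4
SELECT
  key,
  mask_show_last_n(value, 4, 'x', 'x', 'x', -1, '1') AS value
FROM default.rangertbl5
WHERE current_date() = "2019-04-28";

Both predicates and both transformer calls are taken verbatim from the
filterExpr and dataMaskDef entries in the JSON above.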
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q1.sql b/submarine-security/spark-security/src/test/resources/tpcds/q1.sql
deleted file mode 100755
index 70a742d..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q1.sql
+++ /dev/null
@@ -1,34 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH customer_total_return AS
-( SELECT
-    sr_customer_sk AS ctr_customer_sk,
-    sr_store_sk AS ctr_store_sk,
-    sum(sr_return_amt) AS ctr_total_return
-  FROM store_returns, date_dim
-  WHERE sr_returned_date_sk = d_date_sk AND d_year = 2000
-  GROUP BY sr_customer_sk, sr_store_sk)
-SELECT c_customer_id
-FROM customer_total_return ctr1, store, customer
-WHERE ctr1.ctr_total_return >
-  (SELECT avg(ctr_total_return) * 1.2
-  FROM customer_total_return ctr2
-  WHERE ctr1.ctr_store_sk = ctr2.ctr_store_sk)
-  AND s_store_sk = ctr1.ctr_store_sk
-  AND s_state = 'TN'
-  AND ctr1.ctr_customer_sk = c_customer_sk
-ORDER BY c_customer_id
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q10.sql b/submarine-security/spark-security/src/test/resources/tpcds/q10.sql
deleted file mode 100755
index 5469f13..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q10.sql
+++ /dev/null
@@ -1,72 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  cd_gender,
-  cd_marital_status,
-  cd_education_status,
-  count(*) cnt1,
-  cd_purchase_estimate,
-  count(*) cnt2,
-  cd_credit_rating,
-  count(*) cnt3,
-  cd_dep_count,
-  count(*) cnt4,
-  cd_dep_employed_count,
-  count(*) cnt5,
-  cd_dep_college_count,
-  count(*) cnt6
-FROM
-  customer c, customer_address ca, customer_demographics
-WHERE
-  c.c_current_addr_sk = ca.ca_address_sk AND
-    ca_county IN ('Rush County', 'Toole County', 'Jefferson County',
-                  'Dona Ana County', 'La Porte County') AND
-    cd_demo_sk = c.c_current_cdemo_sk AND
-    exists(SELECT *
-           FROM store_sales, date_dim
-           WHERE c.c_customer_sk = ss_customer_sk AND
-             ss_sold_date_sk = d_date_sk AND
-             d_year = 2002 AND
-             d_moy BETWEEN 1 AND 1 + 3) AND
-    (exists(SELECT *
-            FROM web_sales, date_dim
-            WHERE c.c_customer_sk = ws_bill_customer_sk AND
-              ws_sold_date_sk = d_date_sk AND
-              d_year = 2002 AND
-              d_moy BETWEEN 1 AND 1 + 3) OR
-      exists(SELECT *
-             FROM catalog_sales, date_dim
-             WHERE c.c_customer_sk = cs_ship_customer_sk AND
-               cs_sold_date_sk = d_date_sk AND
-               d_year = 2002 AND
-               d_moy BETWEEN 1 AND 1 + 3))
-GROUP BY cd_gender,
-  cd_marital_status,
-  cd_education_status,
-  cd_purchase_estimate,
-  cd_credit_rating,
-  cd_dep_count,
-  cd_dep_employed_count,
-  cd_dep_college_count
-ORDER BY cd_gender,
-  cd_marital_status,
-  cd_education_status,
-  cd_purchase_estimate,
-  cd_credit_rating,
-  cd_dep_count,
-  cd_dep_employed_count,
-  cd_dep_college_count
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q11.sql b/submarine-security/spark-security/src/test/resources/tpcds/q11.sql
deleted file mode 100755
index 24e3899..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q11.sql
+++ /dev/null
@@ -1,83 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH year_total AS (
-  SELECT
-    c_customer_id customer_id,
-    c_first_name customer_first_name,
-    c_last_name customer_last_name,
-    c_preferred_cust_flag customer_preferred_cust_flag,
-    c_birth_country customer_birth_country,
-    c_login customer_login,
-    c_email_address customer_email_address,
-    d_year dyear,
-    sum(ss_ext_list_price - ss_ext_discount_amt) year_total,
-    's' sale_type
-  FROM customer, store_sales, date_dim
-  WHERE c_customer_sk = ss_customer_sk
-    AND ss_sold_date_sk = d_date_sk
-  GROUP BY c_customer_id
-    , c_first_name
-    , c_last_name
-    , d_year
-    , c_preferred_cust_flag
-    , c_birth_country
-    , c_login
-    , c_email_address
-    , d_year
-  UNION ALL
-  SELECT
-    c_customer_id customer_id,
-    c_first_name customer_first_name,
-    c_last_name customer_last_name,
-    c_preferred_cust_flag customer_preferred_cust_flag,
-    c_birth_country customer_birth_country,
-    c_login customer_login,
-    c_email_address customer_email_address,
-    d_year dyear,
-    sum(ws_ext_list_price - ws_ext_discount_amt) year_total,
-    'w' sale_type
-  FROM customer, web_sales, date_dim
-  WHERE c_customer_sk = ws_bill_customer_sk
-    AND ws_sold_date_sk = d_date_sk
-  GROUP BY
-    c_customer_id, c_first_name, c_last_name, c_preferred_cust_flag, c_birth_country,
-    c_login, c_email_address, d_year)
-SELECT t_s_secyear.customer_preferred_cust_flag
-FROM year_total t_s_firstyear
-  , year_total t_s_secyear
-  , year_total t_w_firstyear
-  , year_total t_w_secyear
-WHERE t_s_secyear.customer_id = t_s_firstyear.customer_id
-  AND t_s_firstyear.customer_id = t_w_secyear.customer_id
-  AND t_s_firstyear.customer_id = t_w_firstyear.customer_id
-  AND t_s_firstyear.sale_type = 's'
-  AND t_w_firstyear.sale_type = 'w'
-  AND t_s_secyear.sale_type = 's'
-  AND t_w_secyear.sale_type = 'w'
-  AND t_s_firstyear.dyear = 2001
-  AND t_s_secyear.dyear = 2001 + 1
-  AND t_w_firstyear.dyear = 2001
-  AND t_w_secyear.dyear = 2001 + 1
-  AND t_s_firstyear.year_total > 0
-  AND t_w_firstyear.year_total > 0
-  AND CASE WHEN t_w_firstyear.year_total > 0
-  THEN t_w_secyear.year_total / t_w_firstyear.year_total
-      ELSE NULL END
-  > CASE WHEN t_s_firstyear.year_total > 0
-  THEN t_s_secyear.year_total / t_s_firstyear.year_total
-    ELSE NULL END
-ORDER BY t_s_secyear.customer_preferred_cust_flag
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q12.sql b/submarine-security/spark-security/src/test/resources/tpcds/q12.sql
deleted file mode 100755
index a7b8124..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q12.sql
+++ /dev/null
@@ -1,37 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_item_desc,
-  i_category,
-  i_class,
-  i_current_price,
-  sum(ws_ext_sales_price) AS itemrevenue,
-  sum(ws_ext_sales_price) * 100 / sum(sum(ws_ext_sales_price))
-  OVER
-  (PARTITION BY i_class) AS revenueratio
-FROM
-  web_sales, item, date_dim
-WHERE
-  ws_item_sk = i_item_sk
-    AND i_category IN ('Sports', 'Books', 'Home')
-    AND ws_sold_date_sk = d_date_sk
-    AND d_date BETWEEN cast('1999-02-22' AS DATE)
-  AND (cast('1999-02-22' AS DATE) + INTERVAL 30 days)
-GROUP BY
-  i_item_id, i_item_desc, i_category, i_class, i_current_price
-ORDER BY
-  i_category, i_class, i_item_id, i_item_desc, revenueratio
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q13.sql b/submarine-security/spark-security/src/test/resources/tpcds/q13.sql
deleted file mode 100755
index 4a1c40d..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q13.sql
+++ /dev/null
@@ -1,64 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  avg(ss_quantity),
-  avg(ss_ext_sales_price),
-  avg(ss_ext_wholesale_cost),
-  sum(ss_ext_wholesale_cost)
-FROM store_sales
-  , store
-  , customer_demographics
-  , household_demographics
-  , customer_address
-  , date_dim
-WHERE s_store_sk = ss_store_sk
-  AND ss_sold_date_sk = d_date_sk AND d_year = 2001
-  AND ((ss_hdemo_sk = hd_demo_sk
-  AND cd_demo_sk = ss_cdemo_sk
-  AND cd_marital_status = 'M'
-  AND cd_education_status = 'Advanced Degree'
-  AND ss_sales_price BETWEEN 100.00 AND 150.00
-  AND hd_dep_count = 3
-) OR
-  (ss_hdemo_sk = hd_demo_sk
-    AND cd_demo_sk = ss_cdemo_sk
-    AND cd_marital_status = 'S'
-    AND cd_education_status = 'College'
-    AND ss_sales_price BETWEEN 50.00 AND 100.00
-    AND hd_dep_count = 1
-  ) OR
-  (ss_hdemo_sk = hd_demo_sk
-    AND cd_demo_sk = ss_cdemo_sk
-    AND cd_marital_status = 'W'
-    AND cd_education_status = '2 yr Degree'
-    AND ss_sales_price BETWEEN 150.00 AND 200.00
-    AND hd_dep_count = 1
-  ))
-  AND ((ss_addr_sk = ca_address_sk
-  AND ca_country = 'United States'
-  AND ca_state IN ('TX', 'OH', 'TX')
-  AND ss_net_profit BETWEEN 100 AND 200
-) OR
-  (ss_addr_sk = ca_address_sk
-    AND ca_country = 'United States'
-    AND ca_state IN ('OR', 'NM', 'KY')
-    AND ss_net_profit BETWEEN 150 AND 300
-  ) OR
-  (ss_addr_sk = ca_address_sk
-    AND ca_country = 'United States'
-    AND ca_state IN ('VA', 'TX', 'MS')
-    AND ss_net_profit BETWEEN 50 AND 250
-  ))
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q14a.sql b/submarine-security/spark-security/src/test/resources/tpcds/q14a.sql
deleted file mode 100755
index 8259c23..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q14a.sql
+++ /dev/null
@@ -1,135 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH cross_items AS
-(SELECT i_item_sk ss_item_sk
-  FROM item,
-    (SELECT
-      iss.i_brand_id brand_id,
-      iss.i_class_id class_id,
-      iss.i_category_id category_id
-    FROM store_sales, item iss, date_dim d1
-    WHERE ss_item_sk = iss.i_item_sk
-      AND ss_sold_date_sk = d1.d_date_sk
-      AND d1.d_year BETWEEN 1999 AND 1999 + 2
-    INTERSECT
-    SELECT
-      ics.i_brand_id,
-      ics.i_class_id,
-      ics.i_category_id
-    FROM catalog_sales, item ics, date_dim d2
-    WHERE cs_item_sk = ics.i_item_sk
-      AND cs_sold_date_sk = d2.d_date_sk
-      AND d2.d_year BETWEEN 1999 AND 1999 + 2
-    INTERSECT
-    SELECT
-      iws.i_brand_id,
-      iws.i_class_id,
-      iws.i_category_id
-    FROM web_sales, item iws, date_dim d3
-    WHERE ws_item_sk = iws.i_item_sk
-      AND ws_sold_date_sk = d3.d_date_sk
-      AND d3.d_year BETWEEN 1999 AND 1999 + 2) x
-  WHERE i_brand_id = brand_id
-    AND i_class_id = class_id
-    AND i_category_id = category_id
-),
-    avg_sales AS
-  (SELECT avg(quantity * list_price) average_sales
-  FROM (
-         SELECT
-           ss_quantity quantity,
-           ss_list_price list_price
-         FROM store_sales, date_dim
-         WHERE ss_sold_date_sk = d_date_sk
-           AND d_year BETWEEN 1999 AND 2001
-         UNION ALL
-         SELECT
-           cs_quantity quantity,
-           cs_list_price list_price
-         FROM catalog_sales, date_dim
-         WHERE cs_sold_date_sk = d_date_sk
-           AND d_year BETWEEN 1999 AND 1999 + 2
-         UNION ALL
-         SELECT
-           ws_quantity quantity,
-           ws_list_price list_price
-         FROM web_sales, date_dim
-         WHERE ws_sold_date_sk = d_date_sk
-           AND d_year BETWEEN 1999 AND 1999 + 2) x)
-SELECT
-  channel,
-  i_brand_id,
-  i_class_id,
-  i_category_id,
-  sum(sales),
-  sum(number_sales)
-FROM (
-       SELECT
-         'store' channel,
-         i_brand_id,
-         i_class_id,
-         i_category_id,
-         sum(ss_quantity * ss_list_price) sales,
-         count(*) number_sales
-       FROM store_sales, item, date_dim
-       WHERE ss_item_sk IN (SELECT ss_item_sk
-       FROM cross_items)
-         AND ss_item_sk = i_item_sk
-         AND ss_sold_date_sk = d_date_sk
-         AND d_year = 1999 + 2
-         AND d_moy = 11
-       GROUP BY i_brand_id, i_class_id, i_category_id
-       HAVING sum(ss_quantity * ss_list_price) > (SELECT average_sales
-       FROM avg_sales)
-       UNION ALL
-       SELECT
-         'catalog' channel,
-         i_brand_id,
-         i_class_id,
-         i_category_id,
-         sum(cs_quantity * cs_list_price) sales,
-         count(*) number_sales
-       FROM catalog_sales, item, date_dim
-       WHERE cs_item_sk IN (SELECT ss_item_sk
-       FROM cross_items)
-         AND cs_item_sk = i_item_sk
-         AND cs_sold_date_sk = d_date_sk
-         AND d_year = 1999 + 2
-         AND d_moy = 11
-       GROUP BY i_brand_id, i_class_id, i_category_id
-       HAVING sum(cs_quantity * cs_list_price) > (SELECT average_sales FROM avg_sales)
-       UNION ALL
-       SELECT
-         'web' channel,
-         i_brand_id,
-         i_class_id,
-         i_category_id,
-         sum(ws_quantity * ws_list_price) sales,
-         count(*) number_sales
-       FROM web_sales, item, date_dim
-       WHERE ws_item_sk IN (SELECT ss_item_sk
-       FROM cross_items)
-         AND ws_item_sk = i_item_sk
-         AND ws_sold_date_sk = d_date_sk
-         AND d_year = 1999 + 2
-         AND d_moy = 11
-       GROUP BY i_brand_id, i_class_id, i_category_id
-       HAVING sum(ws_quantity * ws_list_price) > (SELECT average_sales
-       FROM avg_sales)
-     ) y
-GROUP BY ROLLUP (channel, i_brand_id, i_class_id, i_category_id)
-ORDER BY channel, i_brand_id, i_class_id, i_category_id
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q14b.sql b/submarine-security/spark-security/src/test/resources/tpcds/q14b.sql
deleted file mode 100755
index 6beb73c..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q14b.sql
+++ /dev/null
@@ -1,110 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH cross_items AS
-(SELECT i_item_sk ss_item_sk
-  FROM item,
-    (SELECT
-      iss.i_brand_id brand_id,
-      iss.i_class_id class_id,
-      iss.i_category_id category_id
-    FROM store_sales, item iss, date_dim d1
-    WHERE ss_item_sk = iss.i_item_sk
-      AND ss_sold_date_sk = d1.d_date_sk
-      AND d1.d_year BETWEEN 1999 AND 1999 + 2
-    INTERSECT
-    SELECT
-      ics.i_brand_id,
-      ics.i_class_id,
-      ics.i_category_id
-    FROM catalog_sales, item ics, date_dim d2
-    WHERE cs_item_sk = ics.i_item_sk
-      AND cs_sold_date_sk = d2.d_date_sk
-      AND d2.d_year BETWEEN 1999 AND 1999 + 2
-    INTERSECT
-    SELECT
-      iws.i_brand_id,
-      iws.i_class_id,
-      iws.i_category_id
-    FROM web_sales, item iws, date_dim d3
-    WHERE ws_item_sk = iws.i_item_sk
-      AND ws_sold_date_sk = d3.d_date_sk
-      AND d3.d_year BETWEEN 1999 AND 1999 + 2) x
-  WHERE i_brand_id = brand_id
-    AND i_class_id = class_id
-    AND i_category_id = category_id
-),
-    avg_sales AS
-  (SELECT avg(quantity * list_price) average_sales
-  FROM (SELECT
-          ss_quantity quantity,
-          ss_list_price list_price
-        FROM store_sales, date_dim
-        WHERE ss_sold_date_sk = d_date_sk AND d_year BETWEEN 1999 AND 1999 + 2
-        UNION ALL
-        SELECT
-          cs_quantity quantity,
-          cs_list_price list_price
-        FROM catalog_sales, date_dim
-        WHERE cs_sold_date_sk = d_date_sk AND d_year BETWEEN 1999 AND 1999 + 2
-        UNION ALL
-        SELECT
-          ws_quantity quantity,
-          ws_list_price list_price
-        FROM web_sales, date_dim
-        WHERE ws_sold_date_sk = d_date_sk AND d_year BETWEEN 1999 AND 1999 + 2) x)
-SELECT *
-FROM
-  (SELECT
-    'store' channel,
-    i_brand_id,
-    i_class_id,
-    i_category_id,
-    sum(ss_quantity * ss_list_price) sales,
-    count(*) number_sales
-  FROM store_sales, item, date_dim
-  WHERE ss_item_sk IN (SELECT ss_item_sk
-  FROM cross_items)
-    AND ss_item_sk = i_item_sk
-    AND ss_sold_date_sk = d_date_sk
-    AND d_week_seq = (SELECT d_week_seq
-  FROM date_dim
-  WHERE d_year = 1999 + 1 AND d_moy = 12 AND d_dom = 11)
-  GROUP BY i_brand_id, i_class_id, i_category_id
-  HAVING sum(ss_quantity * ss_list_price) > (SELECT average_sales
-  FROM avg_sales)) this_year,
-  (SELECT
-    'store' channel,
-    i_brand_id,
-    i_class_id,
-    i_category_id,
-    sum(ss_quantity * ss_list_price) sales,
-    count(*) number_sales
-  FROM store_sales, item, date_dim
-  WHERE ss_item_sk IN (SELECT ss_item_sk
-  FROM cross_items)
-    AND ss_item_sk = i_item_sk
-    AND ss_sold_date_sk = d_date_sk
-    AND d_week_seq = (SELECT d_week_seq
-  FROM date_dim
-  WHERE d_year = 1999 AND d_moy = 12 AND d_dom = 11)
-  GROUP BY i_brand_id, i_class_id, i_category_id
-  HAVING sum(ss_quantity * ss_list_price) > (SELECT average_sales
-  FROM avg_sales)) last_year
-WHERE this_year.i_brand_id = last_year.i_brand_id
-  AND this_year.i_class_id = last_year.i_class_id
-  AND this_year.i_category_id = last_year.i_category_id
-ORDER BY this_year.channel, this_year.i_brand_id, this_year.i_class_id, this_year.i_category_id
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q15.sql b/submarine-security/spark-security/src/test/resources/tpcds/q15.sql
deleted file mode 100755
index 64e98b8..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q15.sql
+++ /dev/null
@@ -1,30 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  ca_zip,
-  sum(cs_sales_price)
-FROM catalog_sales, customer, customer_address, date_dim
-WHERE cs_bill_customer_sk = c_customer_sk
-  AND c_current_addr_sk = ca_address_sk
-  AND (substr(ca_zip, 1, 5) IN ('85669', '86197', '88274', '83405', '86475',
-                                '85392', '85460', '80348', '81792')
-  OR ca_state IN ('CA', 'WA', 'GA')
-  OR cs_sales_price > 500)
-  AND cs_sold_date_sk = d_date_sk
-  AND d_qoy = 2 AND d_year = 2001
-GROUP BY ca_zip
-ORDER BY ca_zip
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q16.sql b/submarine-security/spark-security/src/test/resources/tpcds/q16.sql
deleted file mode 100755
index 25508b9..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q16.sql
+++ /dev/null
@@ -1,38 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  count(DISTINCT cs_order_number) AS `order count `,
-  sum(cs_ext_ship_cost) AS `total shipping cost `,
-  sum(cs_net_profit) AS `total net profit `
-FROM
-  catalog_sales cs1, date_dim, customer_address, call_center
-WHERE
-  d_date BETWEEN '2002-02-01' AND (CAST('2002-02-01' AS DATE) + INTERVAL 60 days)
-    AND cs1.cs_ship_date_sk = d_date_sk
-    AND cs1.cs_ship_addr_sk = ca_address_sk
-    AND ca_state = 'GA'
-    AND cs1.cs_call_center_sk = cc_call_center_sk
-    AND cc_county IN
-    ('Williamson County', 'Williamson County', 'Williamson County', 'Williamson County', 'Williamson County')
-    AND EXISTS(SELECT *
-               FROM catalog_sales cs2
-               WHERE cs1.cs_order_number = cs2.cs_order_number
-                 AND cs1.cs_warehouse_sk <> cs2.cs_warehouse_sk)
-    AND NOT EXISTS(SELECT *
-                   FROM catalog_returns cr1
-                   WHERE cs1.cs_order_number = cr1.cr_order_number)
-ORDER BY count(DISTINCT cs_order_number)
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q17.sql b/submarine-security/spark-security/src/test/resources/tpcds/q17.sql
deleted file mode 100755
index 9854cb0..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q17.sql
+++ /dev/null
@@ -1,48 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_item_id,
-  i_item_desc,
-  s_state,
-  count(ss_quantity) AS store_sales_quantitycount,
-  avg(ss_quantity) AS store_sales_quantityave,
-  stddev_samp(ss_quantity) AS store_sales_quantitystdev,
-  stddev_samp(ss_quantity) / avg(ss_quantity) AS store_sales_quantitycov,
-  count(sr_return_quantity) as_store_returns_quantitycount,
-  avg(sr_return_quantity) as_store_returns_quantityave,
-  stddev_samp(sr_return_quantity) as_store_returns_quantitystdev,
-  stddev_samp(sr_return_quantity) / avg(sr_return_quantity) AS store_returns_quantitycov,
-  count(cs_quantity) AS catalog_sales_quantitycount,
-  avg(cs_quantity) AS catalog_sales_quantityave,
-  stddev_samp(cs_quantity) / avg(cs_quantity) AS catalog_sales_quantitystdev,
-  stddev_samp(cs_quantity) / avg(cs_quantity) AS catalog_sales_quantitycov
-FROM store_sales, store_returns, catalog_sales, date_dim d1, date_dim d2, date_dim d3, store, item
-WHERE d1.d_quarter_name = '2001Q1'
-  AND d1.d_date_sk = ss_sold_date_sk
-  AND i_item_sk = ss_item_sk
-  AND s_store_sk = ss_store_sk
-  AND ss_customer_sk = sr_customer_sk
-  AND ss_item_sk = sr_item_sk
-  AND ss_ticket_number = sr_ticket_number
-  AND sr_returned_date_sk = d2.d_date_sk
-  AND d2.d_quarter_name IN ('2001Q1', '2001Q2', '2001Q3')
-  AND sr_customer_sk = cs_bill_customer_sk
-  AND sr_item_sk = cs_item_sk
-  AND cs_sold_date_sk = d3.d_date_sk
-  AND d3.d_quarter_name IN ('2001Q1', '2001Q2', '2001Q3')
-GROUP BY i_item_id, i_item_desc, s_state
-ORDER BY i_item_id, i_item_desc, s_state
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q18.sql b/submarine-security/spark-security/src/test/resources/tpcds/q18.sql
deleted file mode 100755
index 1214d9b..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q18.sql
+++ /dev/null
@@ -1,43 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_item_id,
-  ca_country,
-  ca_state,
-  ca_county,
-  avg(cast(cs_quantity AS DECIMAL(12, 2))) agg1,
-  avg(cast(cs_list_price AS DECIMAL(12, 2))) agg2,
-  avg(cast(cs_coupon_amt AS DECIMAL(12, 2))) agg3,
-  avg(cast(cs_sales_price AS DECIMAL(12, 2))) agg4,
-  avg(cast(cs_net_profit AS DECIMAL(12, 2))) agg5,
-  avg(cast(c_birth_year AS DECIMAL(12, 2))) agg6,
-  avg(cast(cd1.cd_dep_count AS DECIMAL(12, 2))) agg7
-FROM catalog_sales, customer_demographics cd1,
-  customer_demographics cd2, customer, customer_address, date_dim, item
-WHERE cs_sold_date_sk = d_date_sk AND
-  cs_item_sk = i_item_sk AND
-  cs_bill_cdemo_sk = cd1.cd_demo_sk AND
-  cs_bill_customer_sk = c_customer_sk AND
-  cd1.cd_gender = 'F' AND
-  cd1.cd_education_status = 'Unknown' AND
-  c_current_cdemo_sk = cd2.cd_demo_sk AND
-  c_current_addr_sk = ca_address_sk AND
-  c_birth_month IN (1, 6, 8, 9, 12, 2) AND
-  d_year = 1998 AND
-  ca_state IN ('MS', 'IN', 'ND', 'OK', 'NM', 'VA', 'MS')
-GROUP BY ROLLUP (i_item_id, ca_country, ca_state, ca_county)
-ORDER BY ca_country, ca_state, ca_county, i_item_id
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q19.sql b/submarine-security/spark-security/src/test/resources/tpcds/q19.sql
deleted file mode 100755
index 7623d12..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q19.sql
+++ /dev/null
@@ -1,34 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_brand_id brand_id,
-  i_brand brand,
-  i_manufact_id,
-  i_manufact,
-  sum(ss_ext_sales_price) ext_price
-FROM date_dim, store_sales, item, customer, customer_address, store
-WHERE d_date_sk = ss_sold_date_sk
-  AND ss_item_sk = i_item_sk
-  AND i_manager_id = 8
-  AND d_moy = 11
-  AND d_year = 1998
-  AND ss_customer_sk = c_customer_sk
-  AND c_current_addr_sk = ca_address_sk
-  AND substr(ca_zip, 1, 5) <> substr(s_zip, 1, 5)
-  AND ss_store_sk = s_store_sk
-GROUP BY i_brand, i_brand_id, i_manufact_id, i_manufact
-ORDER BY ext_price DESC, brand, brand_id, i_manufact_id, i_manufact
-LIMIT 100
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q2.sql b/submarine-security/spark-security/src/test/resources/tpcds/q2.sql
deleted file mode 100755
index b6b07e2..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q2.sql
+++ /dev/null
@@ -1,96 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH wscs AS
-( SELECT
-    sold_date_sk,
-    sales_price
-  FROM (SELECT
-    ws_sold_date_sk sold_date_sk,
-    ws_ext_sales_price sales_price
-  FROM web_sales) x
-  UNION ALL
-  (SELECT
-    cs_sold_date_sk sold_date_sk,
-    cs_ext_sales_price sales_price
-  FROM catalog_sales)),
-    wswscs AS
-  ( SELECT
-    d_week_seq,
-    sum(CASE WHEN (d_day_name = 'Sunday')
-      THEN sales_price
-        ELSE NULL END)
-    sun_sales,
-    sum(CASE WHEN (d_day_name = 'Monday')
-      THEN sales_price
-        ELSE NULL END)
-    mon_sales,
-    sum(CASE WHEN (d_day_name = 'Tuesday')
-      THEN sales_price
-        ELSE NULL END)
-    tue_sales,
-    sum(CASE WHEN (d_day_name = 'Wednesday')
-      THEN sales_price
-        ELSE NULL END)
-    wed_sales,
-    sum(CASE WHEN (d_day_name = 'Thursday')
-      THEN sales_price
-        ELSE NULL END)
-    thu_sales,
-    sum(CASE WHEN (d_day_name = 'Friday')
-      THEN sales_price
-        ELSE NULL END)
-    fri_sales,
-    sum(CASE WHEN (d_day_name = 'Saturday')
-      THEN sales_price
-        ELSE NULL END)
-    sat_sales
-  FROM wscs, date_dim
-  WHERE d_date_sk = sold_date_sk
-  GROUP BY d_week_seq)
-SELECT
-  d_week_seq1,
-  round(sun_sales1 / sun_sales2, 2),
-  round(mon_sales1 / mon_sales2, 2),
-  round(tue_sales1 / tue_sales2, 2),
-  round(wed_sales1 / wed_sales2, 2),
-  round(thu_sales1 / thu_sales2, 2),
-  round(fri_sales1 / fri_sales2, 2),
-  round(sat_sales1 / sat_sales2, 2)
-FROM
-  (SELECT
-    wswscs.d_week_seq d_week_seq1,
-    sun_sales sun_sales1,
-    mon_sales mon_sales1,
-    tue_sales tue_sales1,
-    wed_sales wed_sales1,
-    thu_sales thu_sales1,
-    fri_sales fri_sales1,
-    sat_sales sat_sales1
-  FROM wswscs, date_dim
-  WHERE date_dim.d_week_seq = wswscs.d_week_seq AND d_year = 2001) y,
-  (SELECT
-    wswscs.d_week_seq d_week_seq2,
-    sun_sales sun_sales2,
-    mon_sales mon_sales2,
-    tue_sales tue_sales2,
-    wed_sales wed_sales2,
-    thu_sales thu_sales2,
-    fri_sales fri_sales2,
-    sat_sales sat_sales2
-  FROM wswscs, date_dim
-  WHERE date_dim.d_week_seq = wswscs.d_week_seq AND d_year = 2001 + 1) z
-WHERE d_week_seq1 = d_week_seq2 - 53
-ORDER BY d_week_seq1
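
The q2.sql removed above pivots daily sales into one column per weekday with conditional sums, then joins the pivoted wswscs CTE to itself on d_week_seq1 = d_week_seq2 - 53 so each week is compared with the corresponding week one year later. A reduced sketch of the conditional-sum pivot, trimmed to two weekdays over hypothetical inline data:

  SELECT
    d_week_seq,
    sum(CASE WHEN d_day_name = 'Sunday' THEN sales_price ELSE NULL END) sun_sales,
    sum(CASE WHEN d_day_name = 'Monday' THEN sales_price ELSE NULL END) mon_sales
  FROM VALUES (1, 'Sunday', 10.0), (1, 'Monday', 20.0), (2, 'Sunday', 5.0)
    AS wscs(d_week_seq, d_day_name, sales_price)
  GROUP BY d_week_seq
  ORDER BY d_week_seq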
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q20.sql b/submarine-security/spark-security/src/test/resources/tpcds/q20.sql
deleted file mode 100755
index bdddee2..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q20.sql
+++ /dev/null
@@ -1,33 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_item_desc,
-  i_category,
-  i_class,
-  i_current_price,
-  sum(cs_ext_sales_price) AS itemrevenue,
-  sum(cs_ext_sales_price) * 100 / sum(sum(cs_ext_sales_price))
-  OVER
-  (PARTITION BY i_class) AS revenueratio
-FROM catalog_sales, item, date_dim
-WHERE cs_item_sk = i_item_sk
-  AND i_category IN ('Sports', 'Books', 'Home')
-  AND cs_sold_date_sk = d_date_sk
-  AND d_date BETWEEN cast('1999-02-22' AS DATE)
-AND (cast('1999-02-22' AS DATE) + INTERVAL 30 days)
-GROUP BY i_item_id, i_item_desc, i_category, i_class, i_current_price
-ORDER BY i_category, i_class, i_item_id, i_item_desc, revenueratio
-LIMIT 100
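
The revenueratio column in the q20.sql removed above divides each grouped row's revenue by a windowed class total, sum(sum(...)) OVER (PARTITION BY i_class): the inner sum is the GROUP BY aggregate, and the outer sum re-totals those aggregates per class after grouping. The same aggregate-inside-window pattern, sketched over hypothetical inline rows:

  SELECT
    i_class,
    i_item_id,
    sum(price) AS itemrevenue,
    -- each item's share (in percent) of its class's total revenue
    sum(price) * 100 / sum(sum(price)) OVER (PARTITION BY i_class) AS revenueratio
  FROM VALUES ('shoes', 'a', 10.0), ('shoes', 'b', 30.0), ('hats', 'c', 5.0)
    AS t(i_class, i_item_id, price)
  GROUP BY i_class, i_item_id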
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q21.sql b/submarine-security/spark-security/src/test/resources/tpcds/q21.sql
deleted file mode 100755
index e90b787..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q21.sql
+++ /dev/null
@@ -1,40 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT *
-FROM (
-       SELECT
-         w_warehouse_name,
-         i_item_id,
-         sum(CASE WHEN (cast(d_date AS DATE) < cast('2000-03-11' AS DATE))
-           THEN inv_quantity_on_hand
-             ELSE 0 END) AS inv_before,
-         sum(CASE WHEN (cast(d_date AS DATE) >= cast('2000-03-11' AS DATE))
-           THEN inv_quantity_on_hand
-             ELSE 0 END) AS inv_after
-       FROM inventory, warehouse, item, date_dim
-       WHERE i_current_price BETWEEN 0.99 AND 1.49
-         AND i_item_sk = inv_item_sk
-         AND inv_warehouse_sk = w_warehouse_sk
-         AND inv_date_sk = d_date_sk
-         AND d_date BETWEEN (cast('2000-03-11' AS DATE) - INTERVAL 30 days)
-       AND (cast('2000-03-11' AS DATE) + INTERVAL 30 days)
-       GROUP BY w_warehouse_name, i_item_id) x
-WHERE (CASE WHEN inv_before > 0
-  THEN inv_after / inv_before
-       ELSE NULL
-       END) BETWEEN 2.0 / 3.0 AND 3.0 / 2.0
-ORDER BY w_warehouse_name, i_item_id
-LIMIT 100
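
q21.sql scans a +/-30-day window around a pivot date with DATE +/- INTERVAL arithmetic and splits on-hand inventory into before/after buckets via conditional sums. A sketch of both pieces on hypothetical inline data (the two VALUES rows are illustrative):

  SELECT
    sum(CASE WHEN cast(d_date AS DATE) < cast('2000-03-11' AS DATE)
      THEN inv_quantity_on_hand ELSE 0 END) AS inv_before,
    sum(CASE WHEN cast(d_date AS DATE) >= cast('2000-03-11' AS DATE)
      THEN inv_quantity_on_hand ELSE 0 END) AS inv_after
  FROM VALUES ('2000-03-01', 5), ('2000-03-20', 7)
    AS inv(d_date, inv_quantity_on_hand)
  WHERE cast(d_date AS DATE)
    BETWEEN cast('2000-03-11' AS DATE) - INTERVAL 30 days
    AND cast('2000-03-11' AS DATE) + INTERVAL 30 days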
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q22.sql b/submarine-security/spark-security/src/test/resources/tpcds/q22.sql
deleted file mode 100755
index b83b124..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q22.sql
+++ /dev/null
@@ -1,29 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-SELECT
-  i_product_name,
-  i_brand,
-  i_class,
-  i_category,
-  avg(inv_quantity_on_hand) qoh
-FROM inventory, date_dim, item, warehouse
-WHERE inv_date_sk = d_date_sk
-  AND inv_item_sk = i_item_sk
-  AND inv_warehouse_sk = w_warehouse_sk
-  AND d_month_seq BETWEEN 1200 AND 1200 + 11
-GROUP BY ROLLUP (i_product_name, i_brand, i_class, i_category)
-ORDER BY qoh, i_product_name, i_brand, i_class, i_category
-LIMIT 100
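
GROUP BY ROLLUP in the q22.sql removed above emits one row per full (i_product_name, i_brand, i_class, i_category) group, plus subtotal rows for each prefix of that column list and a grand total, with NULLs marking the rolled-up columns. A minimal two-column ROLLUP over hypothetical inline rows:

  SELECT
    i_category,
    i_brand,
    avg(qoh) AS qoh
  FROM VALUES ('Books', 'x', 10), ('Books', 'y', 20), ('Home', 'x', 30)
    AS inv(i_category, i_brand, qoh)
  GROUP BY ROLLUP (i_category, i_brand)
  ORDER BY i_category, i_brand
  -- yields the three detail rows, a subtotal per category
  -- (i_brand IS NULL), and a grand total (both columns NULL)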
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q23a.sql b/submarine-security/spark-security/src/test/resources/tpcds/q23a.sql
deleted file mode 100755
index b9c4948..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q23a.sql
+++ /dev/null
@@ -1,68 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH frequent_ss_items AS
-(SELECT
-    substr(i_item_desc, 1, 30) itemdesc,
-    i_item_sk item_sk,
-    d_date solddate,
-    count(*) cnt
-  FROM store_sales, date_dim, item
-  WHERE ss_sold_date_sk = d_date_sk
-    AND ss_item_sk = i_item_sk
-    AND d_year IN (2000, 2000 + 1, 2000 + 2, 2000 + 3)
-  GROUP BY substr(i_item_desc, 1, 30), i_item_sk, d_date
-  HAVING count(*) > 4),
-    max_store_sales AS
-  (SELECT max(csales) tpcds_cmax
-  FROM (SELECT
-    c_customer_sk,
-    sum(ss_quantity * ss_sales_price) csales
-  FROM store_sales, customer, date_dim
-  WHERE ss_customer_sk = c_customer_sk
-    AND ss_sold_date_sk = d_date_sk
-    AND d_year IN (2000, 2000 + 1, 2000 + 2, 2000 + 3)
-  GROUP BY c_customer_sk) x),
-    best_ss_customer AS
-  (SELECT
-    c_customer_sk,
-    sum(ss_quantity * ss_sales_price) ssales
-  FROM store_sales, customer
-  WHERE ss_customer_sk = c_customer_sk
-  GROUP BY c_customer_sk
-  HAVING sum(ss_quantity * ss_sales_price) > (50 / 100.0) *
-    (SELECT *
-    FROM max_store_sales))
-SELECT sum(sales)
-FROM ((SELECT cs_quantity * cs_list_price sales
-FROM catalog_sales, date_dim
-WHERE d_year = 2000
-  AND d_moy = 2
-  AND cs_sold_date_sk = d_date_sk
-  AND cs_item_sk IN (SELECT item_sk
-FROM frequent_ss_items)
-  AND cs_bill_customer_sk IN (SELECT c_customer_sk
-FROM best_ss_customer))
-      UNION ALL
-      (SELECT ws_quantity * ws_list_price sales
-      FROM web_sales, date_dim
-      WHERE d_year = 2000
-        AND d_moy = 2
-        AND ws_sold_date_sk = d_date_sk
-        AND ws_item_sk IN (SELECT item_sk
-      FROM frequent_ss_items)
-        AND ws_bill_customer_sk IN (SELECT c_customer_sk
-      FROM best_ss_customer))) y
-LIMIT 100
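
The best_ss_customer CTE in the q23a.sql removed above keeps customers whose store-sales total exceeds half of the single largest customer total, which the earlier max_store_sales CTE computes and the HAVING clause consumes as a scalar subquery. That chained-CTE-plus-scalar-subquery shape, reduced to hypothetical inline data:

  WITH sales AS (
    SELECT customer, amount
    FROM VALUES ('a', 100.0), ('a', 50.0), ('b', 20.0) AS s(customer, amount)),
  max_total AS (
    SELECT max(total) AS tpcds_cmax
    FROM (SELECT customer, sum(amount) AS total
          FROM sales GROUP BY customer) x)
  SELECT customer, sum(amount) AS total
  FROM sales
  GROUP BY customer
  -- keep customers above half of the biggest customer's total
  HAVING sum(amount) > (50 / 100.0) * (SELECT tpcds_cmax FROM max_total)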
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q23b.sql b/submarine-security/spark-security/src/test/resources/tpcds/q23b.sql
deleted file mode 100755
index 7eebb6d..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q23b.sql
+++ /dev/null
@@ -1,83 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH frequent_ss_items AS
-(SELECT
-    substr(i_item_desc, 1, 30) itemdesc,
-    i_item_sk item_sk,
-    d_date solddate,
-    count(*) cnt
-  FROM store_sales, date_dim, item
-  WHERE ss_sold_date_sk = d_date_sk
-    AND ss_item_sk = i_item_sk
-    AND d_year IN (2000, 2000 + 1, 2000 + 2, 2000 + 3)
-  GROUP BY substr(i_item_desc, 1, 30), i_item_sk, d_date
-  HAVING count(*) > 4),
-    max_store_sales AS
-  (SELECT max(csales) tpcds_cmax
-  FROM (SELECT
-    c_customer_sk,
-    sum(ss_quantity * ss_sales_price) csales
-  FROM store_sales, customer, date_dim
-  WHERE ss_customer_sk = c_customer_sk
-    AND ss_sold_date_sk = d_date_sk
-    AND d_year IN (2000, 2000 + 1, 2000 + 2, 2000 + 3)
-  GROUP BY c_customer_sk) x),
-    best_ss_customer AS
-  (SELECT
-    c_customer_sk,
-    sum(ss_quantity * ss_sales_price) ssales
-  FROM store_sales, customer
-  WHERE ss_customer_sk = c_customer_sk
-  GROUP BY c_customer_sk
-  HAVING sum(ss_quantity * ss_sales_price) > (50 / 100.0) *
-    (SELECT *
-    FROM max_store_sales))
-SELECT
-  c_last_name,
-  c_first_name,
-  sales
-FROM ((SELECT
-  c_last_name,
-  c_first_name,
-  sum(cs_quantity * cs_list_price) sales
-FROM catalog_sales, customer, date_dim
-WHERE d_year = 2000
-  AND d_moy = 2
-  AND cs_sold_date_sk = d_date_sk
-  AND cs_item_sk IN (SELECT item_sk
-FROM frequent_ss_items)
-  AND cs_bill_customer_sk IN (SELECT c_customer_sk
-FROM best_ss_customer)
-  AND cs_bill_customer_sk = c_customer_sk
-GROUP BY c_last_name, c_first_name)
-      UNION ALL
-      (SELECT
-        c_last_name,
-        c_first_name,
-        sum(ws_quantity * ws_list_price) sales
-      FROM web_sales, customer, date_dim
-      WHERE d_year = 2000
-        AND d_moy = 2
-        AND ws_sold_date_sk = d_date_sk
-        AND ws_item_sk IN (SELECT item_sk
-      FROM frequent_ss_items)
-        AND ws_bill_customer_sk IN (SELECT c_customer_sk
-      FROM best_ss_customer)
-        AND ws_bill_customer_sk = c_customer_sk
-      GROUP BY c_last_name, c_first_name)) y
-ORDER BY c_last_name, c_first_name, sales
-LIMIT 100
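
q23b.sql differs from q23a mainly in joining back to customer so that each channel branch returns per-customer totals; the branches are grouped independently, combined with UNION ALL, and only then ordered and limited, so a customer buying on both channels appears once per channel. A reduced sketch of that shape over hypothetical inline data:

  SELECT c_last_name, sales
  FROM (
    (SELECT c_last_name, sum(quantity * list_price) AS sales
     FROM VALUES ('ann', 2, 3.0), ('ann', 1, 4.0)
       AS c(c_last_name, quantity, list_price)
     GROUP BY c_last_name)
    UNION ALL
    (SELECT c_last_name, sum(quantity * list_price) AS sales
     FROM VALUES ('bob', 5, 1.0) AS w(c_last_name, quantity, list_price)
     GROUP BY c_last_name)) y
  ORDER BY c_last_name, sales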
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q24a.sql b/submarine-security/spark-security/src/test/resources/tpcds/q24a.sql
deleted file mode 100755
index 5db7da3..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q24a.sql
+++ /dev/null
@@ -1,49 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
---  the License. You may obtain a copy of the License at
---
---    http://www.apache.org/licenses/LICENSE-2.0
---
---  Unless required by applicable law or agreed to in writing, software
---  distributed under the License is distributed on an "AS IS" BASIS,
---  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
---  See the License for the specific language governing permissions and
---  limitations under the License.
-
-WITH ssales AS
-(SELECT
-    c_last_name,
-    c_first_name,
-    s_store_name,
-    ca_state,
-    s_state,
-    i_color,
-    i_current_price,
-    i_manager_id,
-    i_units,
-    i_size,
-    sum(ss_net_paid) netpaid
-  FROM store_sales, store_returns, store, item, customer, customer_address
-  WHERE ss_ticket_number = sr_ticket_number
-    AND ss_item_sk = sr_item_sk
-    AND ss_customer_sk = c_customer_sk
-    AND ss_item_sk = i_item_sk
-    AND ss_store_sk = s_store_sk
-    AND c_birth_country = upper(ca_country)
-    AND s_zip = ca_zip
-    AND s_market_id = 8
-  GROUP BY c_last_name, c_first_name, s_store_name, ca_state, s_state, i_color,
-    i_current_price, i_manager_id, i_units, i_size)
-SELECT
-  c_last_name,
-  c_first_name,
-  s_store_name,
-  sum(netpaid) paid
-FROM ssales
-WHERE i_color = 'pale'
-GROUP BY c_last_name, c_first_name, s_store_name
-HAVING sum(netpaid) > (SELECT 0.05 * avg(netpaid)
-FROM ssales)
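
The ssales CTE in the q24a.sql removed above is read twice: once as the grouped source restricted to i_color = 'pale', and once unrestricted inside HAVING, so the 5% threshold is taken over net paid across all colors. The double-reference pattern, sketched on hypothetical inline rows:

  WITH ssales AS (
    SELECT c_last_name, i_color, netpaid
    FROM VALUES ('ann', 'pale', 40.0), ('bob', 'pale', 1.0), ('ann', 'red', 9.0)
      AS t(c_last_name, i_color, netpaid))
  SELECT c_last_name, sum(netpaid) AS paid
  FROM ssales
  WHERE i_color = 'pale'
  GROUP BY c_last_name
  -- threshold computed over ALL colors, not just 'pale'
  HAVING sum(netpaid) > (SELECT 0.05 * avg(netpaid) FROM ssales)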
diff --git a/submarine-security/spark-security/src/test/resources/tpcds/q24b.sql b/submarine-security/spark-security/src/test/resources/tpcds/q24b.sql
deleted file mode 100755
index 8a1995c..0000000
--- a/submarine-security/spark-security/src/test/resources/tpcds/q24b.sql
+++ /dev/null
@@ -1,49 +0,0 @@
---  Licensed to the Apache Software Foundation (ASF) under one or more
---  contributor license agreements. See the NOTICE file distributed with
---  this work for additional information regarding copyright ownership.
---  The ASF licenses this file to You under the Apache License, Version 2.0
---  (the "License"); you may not use this file except in compliance with
... 7578 lines suppressed ...
