Posted to notifications@kyuubi.apache.org by GitBox <gi...@apache.org> on 2022/08/19 11:22:31 UTC

[GitHub] [incubator-kyuubi] yikf opened a new pull request, #3283: Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

yikf opened a new pull request, #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283

   
   <!--
   Thanks for sending a pull request!
   
   Here are some tips for you:
     1. If this is your first time, please read our contributor guidelines: https://kyuubi.readthedocs.io/en/latest/community/contributions.html
     2. If the PR is related to an issue in https://github.com/apache/incubator-kyuubi/issues, add '[KYUUBI #XXXX]' in your PR title, e.g., '[KYUUBI #XXXX] Your PR title ...'.
     3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][KYUUBI #XXXX] Your PR title ...'.
   -->
   
   ### _Why are the changes needed?_
   <!--
   Please clarify why the changes are needed. For instance,
     1. If you add a feature, you can talk about the use case of it.
     2. If you fix a bug, you can clarify why it is a bug.
   -->
   fix https://github.com/apache/incubator-kyuubi/issues/3276
   
   ### _How was this patch tested?_
   - [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible
   
   - [ ] Add screenshots for manual tests if appropriate
   
   - [ ] [Run test](https://kyuubi.apache.org/docs/latest/develop_tools/testing.html#running-tests) locally before making a pull request
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: notifications-unsubscribe@kyuubi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: notifications-unsubscribe@kyuubi.apache.org
For additional commands, e-mail: notifications-help@kyuubi.apache.org


[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950158558


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   I tested different Spark versions (2.4.0, 3.0.0, 3.1.3, 3.2.2, 3.3.0) with a simple script. The conclusion is:
   
   - all versions compile normally except 2.4;
   - the cause of the 2.4 compile failure has nothing to do with this PR (to rule out a premature failure masking something else, I skimmed the Spark code and it looks unrelated; see the [spark 2.4 code](https://github.com/apache/spark/blob/4be566062defa249435c4d72eb106fe7b933e023/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InsertIntoHadoopFsRelationCommand.scala#L57)). But this failure seems to be related to https://github.com/apache/incubator-kyuubi/pull/3262 instead. Will authZ support Spark 2.4?
   
   **script:**
   ```
   #!/bin/bash
   
   versions=("2.4.0" "3.0.0" "3.1.3" "3.2.2" "3.3.0")
   
   last=3.2.2
   for e in ${versions[@]}; do
   	#statements
   	echo "Building Spark version $e"
   
   	sed -i "" "s/\<spark\.version\>$last\<\/spark\.version\>/\<spark\.version\>$e\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   	last=$e
   
   	./build/mvn clean package -DskipTests -T 4 -pl 'extensions/spark/kyuubi-spark-authz' -am -o
   done
   
   
   sed -i "" "s/\<spark\.version\>3.3.0\<\/spark\.version\>/\<spark\.version\>3.2.2\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   ```
   **result:**
   ```
   yikaifei@B000000335049P kyuubi % sh Muti-Spark-Auth.sh
   Building Spark version 2.4.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:25: object TreeNodeTag is not a member of package org.apache.spark.sql.catalyst.trees
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:44: not found: value TreeNodeTag
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:34: value getTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:36: value setTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] four errors found
   [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile (scala-compile-first) on project kyuubi-spark-authz_2.12: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile failed: Compilation failed: InterfaceCompileFailed -> [Help 1]
   [ERROR]
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR]
   [ERROR] For more information about the errors and possible solutions, please read the following articles:
   [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
   [ERROR]
   [ERROR] After correcting the problems, you can resume the build with the command
   [ERROR]   mvn <args> -rf :kyuubi-spark-authz_2.12
   Building Spark version 3.0.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.1.3
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.2.2
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.3.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   ```
   
   
   
   BTW:  there seems to be a problem unrelated to this PR when `./build/mvn clean package -DskipTests -T 4 -Pspark-3.1,spark-3.2,spark-3.3`
   <img width="1436" alt="image" src="https://user-images.githubusercontent.com/51110188/185613310-548c16b3-5975-4f94-8456-f4e80346fee3.png">
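   
   The line this review thread anchors on, `getPlanField[Option[CatalogTable]]("catalogTable")`, presumably reads the named field from the command plan via reflection, so the plugin does not have to compile against each Spark version's command class. A minimal, self-contained illustration of that technique (stand-in types only, not Kyuubi's actual code; the real target would be `InsertIntoHadoopFsRelationCommand`):
   ```
   object ReflectFieldSketch {
     // Stand-in for the real command; InsertIntoHadoopFsRelationCommand carries an
     // Option[CatalogTable] field named "catalogTable" (modeled here as Option[String]).
     final case class FakeInsertCommand(catalogTable: Option[String], outputPath: String)
   
     // Reflectively read a declared field by name, the way a helper like
     // getPlanField presumably does, avoiding a compile-time dependency on the
     // concrete command class.
     def getFieldVal[T](obj: AnyRef, name: String): T = {
       val field = obj.getClass.getDeclaredField(name)
       field.setAccessible(true)
       field.get(obj).asInstanceOf[T]
     }
   
     def main(args: Array[String]): Unit = {
       val cmd = FakeInsertCommand(Some("db.employee"), "/tmp/warehouse/db/employee")
       getFieldVal[Option[String]](cmd, "catalogTable") match {
         // The write targets a catalog table: the case this PR starts checking,
         // i.e. the table should also be registered as a write privilege.
         case Some(table) => println(s"require write privilege on table $table")
         // The write targets a bare path with no catalog table attached.
         case None => println("no catalog table; path-only write")
       }
     }
   }
   ```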
   





[GitHub] [incubator-kyuubi] yaooqinn commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950113465


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   yes



##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   and maybe 2.4 and 3.0





[GitHub] [incubator-kyuubi] yikf commented on pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#issuecomment-1221989744

   Thanks @pan3793 and @yaooqinn for the review; the PR description has also been updated.




[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950158558


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   I tested different Spark versions (2.4.0, 3.0.0, 3.1.3, 3.2.2, 3.3.0) with a simple script. The conclusion is:
   
   - all versions compile normally except 2.4;
   - the cause of the 2.4 compile failure has nothing to do with this PR (to rule out a premature failure masking something else, I skimmed the Spark code and it looks unrelated; see the [spark 2.4 code](https://github.com/apache/spark/blob/4be566062defa249435c4d72eb106fe7b933e023/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InsertIntoHadoopFsRelationCommand.scala#L57)).
   
   **script:**
   ```
   #!/bin/bash
   
   versions=("2.4.0" "3.0.0" "3.1.3" "3.2.2" "3.3.0")
   
   last=3.2.2
   for e in ${versions[@]}; do
   	#statements
   	echo "Building Spark version $e"
   
   	sed -i "" "s/\<spark\.version\>$last\<\/spark\.version\>/\<spark\.version\>$e\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   	last=$e
   
   	./build/mvn clean package -DskipTests -T 4 -pl 'extensions/spark/kyuubi-spark-authz' -am -o
   done
   
   
   sed -i "" "s/\<spark\.version\>3.3.0\<\/spark\.version\>/\<spark\.version\>3.2.2\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   ```
   **result:**
   ```
   yikaifei@B000000335049P kyuubi % sh Muti-Spark-Auth.sh
   Building Spark version 2.4.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:25: object TreeNodeTag is not a member of package org.apache.spark.sql.catalyst.trees
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:44: not found: value TreeNodeTag
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:34: value getTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:36: value setTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] four errors found
   [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile (scala-compile-first) on project kyuubi-spark-authz_2.12: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile failed: Compilation failed: InterfaceCompileFailed -> [Help 1]
   [ERROR]
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR]
   [ERROR] For more information about the errors and possible solutions, please read the following articles:
   [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
   [ERROR]
   [ERROR] After correcting the problems, you can resume the build with the command
   [ERROR]   mvn <args> -rf :kyuubi-spark-authz_2.12
   Building Spark version 3.0.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.1.3
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.2.2
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.3.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   ```
   
   
   
   BTW:  there seems to be a problem unrelated to this PR when `./build/mvn clean package -DskipTests -T 4 -Pspark-3.1,spark-3.2,spark-3.3`
   <img width="1436" alt="image" src="https://user-images.githubusercontent.com/51110188/185613310-548c16b3-5975-4f94-8456-f4e80346fee3.png">
   





[GitHub] [incubator-kyuubi] yaooqinn commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950098811


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |

Review Comment:
   It seems not very necessary; the storage layer often has ACLs of its own.
   
   But we can first do some research on how Hive and Presto behave, and make the decision and a PR later.
   
   





[GitHub] [incubator-kyuubi] pan3793 commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
pan3793 commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950177571


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   BTW, the Maven properties can be overridden with `-Dk=v`, e.g.:
   ```
   build/mvn clean install -DskipTests -Pspark-3.1 -Dspark.version=3.1.1
   ```





[GitHub] [incubator-kyuubi] codecov-commenter commented on pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
codecov-commenter commented on PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#issuecomment-1221918422

   # [Codecov](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) Report
   > Merging [#3283](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) (428eae9) into [master](https://codecov.io/gh/apache/incubator-kyuubi/commit/f72d1b7d9d6bfa95d5bf8a8214978b6f25023fe5?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) (f72d1b7) will **decrease** coverage by `0.06%`.
   > The diff coverage is `77.77%`.
   
   > :exclamation: Current head 428eae9 differs from pull request most recent head 958b038. Consider uploading reports for the commit 958b038 to get more accurate results
   
   ```diff
   @@             Coverage Diff              @@
   ##             master    #3283      +/-   ##
   ============================================
   - Coverage     51.37%   51.31%   -0.07%     
     Complexity       13       13              
   ============================================
     Files           469      468       -1     
     Lines         26229    26227       -2     
     Branches       3630     3633       +3     
   ============================================
   - Hits          13476    13458      -18     
   - Misses        11469    11484      +15     
   - Partials       1284     1285       +1     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) | Coverage Δ | |
   |---|---|---|
   | [.../kyuubi/plugin/spark/authz/PrivilegesBuilder.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-ZXh0ZW5zaW9ucy9zcGFyay9reXV1Ymktc3BhcmstYXV0aHovc3JjL21haW4vc2NhbGEvb3JnL2FwYWNoZS9reXV1YmkvcGx1Z2luL3NwYXJrL2F1dGh6L1ByaXZpbGVnZXNCdWlsZGVyLnNjYWxh) | `69.41% <77.77%> (+0.04%)` | :arrow_up: |
   | [...client/exception/RetryableKyuubiRestException.java](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXJlc3QtY2xpZW50L3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9reXV1YmkvY2xpZW50L2V4Y2VwdGlvbi9SZXRyeWFibGVLeXV1YmlSZXN0RXhjZXB0aW9uLmphdmE=) | `0.00% <0.00%> (-100.00%)` | :arrow_down: |
   | [.../org/apache/kyuubi/client/RetryableRestClient.java](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXJlc3QtY2xpZW50L3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9reXV1YmkvY2xpZW50L1JldHJ5YWJsZVJlc3RDbGllbnQuamF2YQ==) | `48.78% <0.00%> (-24.40%)` | :arrow_down: |
   | [...main/java/org/apache/kyuubi/client/RestClient.java](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXJlc3QtY2xpZW50L3NyYy9tYWluL2phdmEvb3JnL2FwYWNoZS9reXV1YmkvY2xpZW50L1Jlc3RDbGllbnQuamF2YQ==) | `82.75% <0.00%> (-3.45%)` | :arrow_down: |
   | [...yuubi/server/metadata/jdbc/JDBCMetadataStore.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXNlcnZlci9zcmMvbWFpbi9zY2FsYS9vcmcvYXBhY2hlL2t5dXViaS9zZXJ2ZXIvbWV0YWRhdGEvamRiYy9KREJDTWV0YWRhdGFTdG9yZS5zY2FsYQ==) | `89.27% <0.00%> (-0.70%)` | :arrow_down: |
   | [...n/scala/org/apache/kyuubi/engine/ProcBuilder.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXNlcnZlci9zcmMvbWFpbi9zY2FsYS9vcmcvYXBhY2hlL2t5dXViaS9lbmdpbmUvUHJvY0J1aWxkZXIuc2NhbGE=) | `82.60% <0.00%> (-0.63%)` | :arrow_down: |
   | [...org/apache/kyuubi/operation/ExecuteStatement.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXNlcnZlci9zcmMvbWFpbi9zY2FsYS9vcmcvYXBhY2hlL2t5dXViaS9vcGVyYXRpb24vRXhlY3V0ZVN0YXRlbWVudC5zY2FsYQ==) | `80.51% <0.00%> (ø)` | |
   | [...ache/kyuubi/operation/KyuubiOperationManager.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLXNlcnZlci9zcmMvbWFpbi9zY2FsYS9vcmcvYXBhY2hlL2t5dXViaS9vcGVyYXRpb24vS3l1dWJpT3BlcmF0aW9uTWFuYWdlci5zY2FsYQ==) | `82.19% <0.00%> (ø)` | |
   | [.../main/scala/org/apache/kyuubi/util/JdbcUtils.scala](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-a3l1dWJpLWNvbW1vbi9zcmMvbWFpbi9zY2FsYS9vcmcvYXBhY2hlL2t5dXViaS91dGlsL0pkYmNVdGlscy5zY2FsYQ==) | | |
   | ... and [1 more](https://codecov.io/gh/apache/incubator-kyuubi/pull/3283/diff?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) | |
   
   :mega: We’re building smart automated test selection to slash your CI/CD build times. [Learn more](https://about.codecov.io/iterative-testing/?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation)
   




[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950109379


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   Do the other versions mean Spark 3.1/3.2/3.3?





[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950200806


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   oh, yea ☺, thanks~





[GitHub] [incubator-kyuubi] yaooqinn commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950094785


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   Can you verify whether there are any compilation errors with other Spark versions, and attach the result to the PR description?





[GitHub] [incubator-kyuubi] yaooqinn commented on pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn commented on PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#issuecomment-1221958887

   I second @pan3793's comments; otherwise LGTM.




[GitHub] [incubator-kyuubi] yaooqinn closed pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn closed pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand
URL: https://github.com/apache/incubator-kyuubi/pull/3283




[GitHub] [incubator-kyuubi] yaooqinn commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yaooqinn commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950108009


##########
extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtensionSuite.scala:
##########
@@ -61,6 +61,43 @@ abstract class RangerSparkExtensionSuite extends AnyFunSuite
     s"Permission denied: user [$user] does not have [$privilege] privilege on [$resource]"
   }
 
+  test("[KYUUBI #3276] InsertIntoHadoopFsRelationCommand: Write tables should also be checked ") {
+    val db = "db"
+    val createDB = s"CREATE DATABASE IF NOT EXISTS $db"
+    val table = "employee"
+    val createTable = s"CREATE TABLE IF NOT EXISTS $db.$table (id string) USING PARQUET"
+    val insert = s"INSERT INTO $db.$table values('id1')"
+
+    try {
+      doAs(
+        "admin", {
+          sql(createDB)
+          sql(createTable)
+
+          // Command may be executed early, so it should be CommandResult, however,
+          // this does not affect the original `Command` authentication, this is due to
+          // `eagerlyExecuteCommands` first plans the original command, this includes the
+          // Optimizer phase (i.e., authenticates the original command as well).
+          val optimizedPlan = sql(insert).queryExecution.optimizedPlan
+          assert(optimizedPlan.isInstanceOf[CommandResult])
+          assert(optimizedPlan.asInstanceOf[CommandResult].commandLogicalPlan.nodeName
+            == "InsertIntoHadoopFsRelationCommand")
+        })
+
+      doAs(
+        "yi", {
+          val e = intercept[RuntimeException](sql(insert).count())
+          assert(e.getMessage === errorMessage("update", s"$db/$table"))
+        })
+    } finally {

Review Comment:
   Not related to this PR, but it's time to make a helper method for this try-finally block.
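   
   A rough sketch of the kind of helper this suggests, under a hypothetical name (`withCleanTmpResources`); `doAs` and `sql` are the suite utilities already used in the quoted diff, and the DROP statements are a guess at what the truncated finally block does:
   ```
   // Run a test body and always drop the temporary tables/databases afterwards,
   // so each test does not repeat the try-finally boilerplate.
   def withCleanTmpResources[T](resources: Seq[(String, String)])(body: => T): T = {
     try {
       body
     } finally {
       resources.foreach { case (name, resourceType) =>
         // e.g. ("db.employee", "table") or ("db", "database")
         doAs("admin", sql(s"DROP ${resourceType.toUpperCase} IF EXISTS $name"))
       }
     }
   }
   
   // Possible usage in the test above:
   withCleanTmpResources(Seq((s"$db.$table", "table"), (db, "database"))) {
     doAs("admin", { sql(createDB); sql(createTable) })
     doAs("yi", { intercept[RuntimeException](sql(insert).count()) })
   }
   ```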





[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950109570


##########
extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtensionSuite.scala:
##########
@@ -61,6 +61,43 @@ abstract class RangerSparkExtensionSuite extends AnyFunSuite
     s"Permission denied: user [$user] does not have [$privilege] privilege on [$resource]"
   }
 
+  test("[KYUUBI #3276] InsertIntoHadoopFsRelationCommand: Write tables should also be checked ") {
+    val db = "db"
+    val createDB = s"CREATE DATABASE IF NOT EXISTS $db"
+    val table = "employee"
+    val createTable = s"CREATE TABLE IF NOT EXISTS $db.$table (id string) USING PARQUET"
+    val insert = s"INSERT INTO $db.$table values('id1')"
+
+    try {
+      doAs(
+        "admin", {
+          sql(createDB)
+          sql(createTable)
+
+          // Command may be executed early, so it should be CommandResult, however,
+          // this does not affect the original `Command` authentication, this is due to
+          // `eagerlyExecuteCommands` first plans the original command, this includes the
+          // Optimizer phase (i.e., authenticates the original command as well).
+          val optimizedPlan = sql(insert).queryExecution.optimizedPlan
+          assert(optimizedPlan.isInstanceOf[CommandResult])
+          assert(optimizedPlan.asInstanceOf[CommandResult].commandLogicalPlan.nodeName
+            == "InsertIntoHadoopFsRelationCommand")
+        })
+
+      doAs(
+        "yi", {
+          val e = intercept[RuntimeException](sql(insert).count())
+          assert(e.getMessage === errorMessage("update", s"$db/$table"))
+        })
+    } finally {

Review Comment:
   Good idea, I will submit a new PR for this.
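   
   As context for the in-diff comment above about `eagerlyExecuteCommands` and `CommandResult`: a standalone sketch of that behavior (it assumes Spark 3.2 or later and a local SparkSession, and is illustration only, not part of the PR):
   ```
   import org.apache.spark.sql.SparkSession
   import org.apache.spark.sql.catalyst.plans.logical.CommandResult
   
   object CommandResultSketch {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder().master("local[1]").getOrCreate()
       try {
         spark.sql("CREATE TABLE t (id STRING) USING PARQUET")
         // On Spark 3.2+ the INSERT command is executed eagerly, so the optimized
         // plan of the returned DataFrame is a CommandResult wrapping the command.
         val plan = spark.sql("INSERT INTO t VALUES ('id1')").queryExecution.optimizedPlan
         plan match {
           case cr: CommandResult =>
             // Prints "InsertIntoHadoopFsRelationCommand", the node the authz
             // plugin needs to treat as a write to the target table.
             println(cr.commandLogicalPlan.nodeName)
           case other =>
             println(s"not eagerly executed: ${other.nodeName}")
         }
       } finally {
         spark.sql("DROP TABLE IF EXISTS t")
         spark.stop()
       }
     }
   }
   ```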





[GitHub] [incubator-kyuubi] pan3793 commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
pan3793 commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950177571


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   BTW, the Maven properties can be overridden with `-Dk=v`, e.g.:
   ```
   build/mvn clean install -DskipTests -Pspark-3.1 -Dspark.version=3.1.1
   ```



##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   BTW, the Maven properties can be overridden with `-Dk=v`, e.g.:
   ```
   build/mvn clean install -DskipTests -Pspark-3.1 -Dspark.version=3.1.1
   ```





[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950087237


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |

Review Comment:
   Shall we check the user's permission on the output path? cc @yaooqinn





[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950168911


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   > BTW: there seems to be a problem unrelated to this PR when `./build/mvn clean package -DskipTests -T 4 -Pspark-3.1,spark-3.2,spark-3.3`
   
   If it is confirmed to be a problem, I will address two points:
   1. Investigate why GA did not catch this
   2. Fix these errors
   
   





[GitHub] [incubator-kyuubi] pan3793 commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
pan3793 commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950175355


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   > `-Pspark-3.1,spark-3.2,spark-3.3`
   
   We don't support that combination because the shared common module is compiled only once, and it may not be compatible with each Spark version.





[GitHub] [incubator-kyuubi] pan3793 commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
pan3793 commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950175355


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   > `-Pspark-3.1,spark-3.2,spark-3.3`
   
   We don't support that combination because the shared common module is compiled only once, and it may not be compatible with each Spark version.





[GitHub] [incubator-kyuubi] yikf commented on a diff in pull request #3283: [KYUUBI #3276][AUTHZ] Should check permission of the writing table If the plan is InsertIntoHadoopFsRelationCommand

Posted by GitBox <gi...@apache.org>.
yikf commented on code in PR #3283:
URL: https://github.com/apache/incubator-kyuubi/pull/3283#discussion_r950158558


##########
extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/PrivilegesBuilder.scala:
##########
@@ -401,12 +401,22 @@ object PrivilegesBuilder {
         buildQuery(getQuery, inputObjs)
 
       case "InsertIntoDataSourceDirCommand" |
-          "SaveIntoDataSourceCommand" |
-          "InsertIntoHadoopFsRelationCommand" |
           "InsertIntoHiveDirCommand" =>
+        buildQuery(getQuery, inputObjs)
+
+      case "SaveIntoDataSourceCommand" =>
         // TODO: Should get the table via datasource options?
         buildQuery(getQuery, inputObjs)
 
+      case "InsertIntoHadoopFsRelationCommand" =>
+        getPlanField[Option[CatalogTable]]("catalogTable").map(opt => {

Review Comment:
   I tested different Spark versions (2.4.0, 3.0.0, 3.1.3, 3.2.2, 3.3.0) with a simple script. The conclusion is:
   
   - all versions compile normally except 2.4;
   - the cause of the 2.4 compile failure has nothing to do with this PR (to rule out a premature failure masking something else, I skimmed the Spark code and it looks unrelated; see the [spark 2.4 code](https://github.com/apache/spark/blob/4be566062defa249435c4d72eb106fe7b933e023/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/InsertIntoHadoopFsRelationCommand.scala#L57)). But this failure seems to be related to https://github.com/apache/incubator-kyuubi/pull/3262 instead; I will file a follow-up PR to fix it.
   
   **script:**
   ```
   #!/bin/bash
   
   versions=("2.4.0" "3.0.0" "3.1.3" "3.2.2" "3.3.0")
   
   last=3.2.2
   for e in ${versions[@]}; do
   	#statements
   	echo "Building Spark version $e"
   
   	sed -i "" "s/\<spark\.version\>$last\<\/spark\.version\>/\<spark\.version\>$e\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   	last=$e
   
   	./build/mvn clean package -DskipTests -T 4 -pl 'extensions/spark/kyuubi-spark-authz' -am -o
   done
   
   
   sed -i "" "s/\<spark\.version\>3.3.0\<\/spark\.version\>/\<spark\.version\>3.2.2\<\/spark\.version\>/g" /Users/yikaifei/code/kyuubi/pom.xml
   ```
   **result:**
   ```
   yikaifei@B000000335049P kyuubi % sh Muti-Spark-Auth.sh
   Building Spark version 2.4.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:25: object TreeNodeTag is not a member of package org.apache.spark.sql.catalyst.trees
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:44: not found: value TreeNodeTag
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:34: value getTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] /Users/yikaifei/code/kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RuleAuthorization.scala:36: value setTagValue is not a member of org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
   [ERROR] four errors found
   [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile (scala-compile-first) on project kyuubi-spark-authz_2.12: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:4.6.1:compile failed: Compilation failed: InterfaceCompileFailed -> [Help 1]
   [ERROR]
   [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
   [ERROR] Re-run Maven using the -X switch to enable full debug logging.
   [ERROR]
   [ERROR] For more information about the errors and possible solutions, please read the following articles:
   [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
   [ERROR]
   [ERROR] After correcting the problems, you can resume the build with the command
   [ERROR]   mvn <args> -rf :kyuubi-spark-authz_2.12
   Building Spark version 3.0.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.1.3
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.2.2
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   Building Spark version 3.3.0
   Using `mvn` from path: /Users/yikaifei/code/kyuubi/build/apache-maven-3.8.4/bin/mvn
   ```
   
   
   
   BTW:  there seems to be a problem unrelated to this PR when `./build/mvn clean package -DskipTests -T 4 -Pspark-3.1,spark-3.2,spark-3.3`
   <img width="1436" alt="image" src="https://user-images.githubusercontent.com/51110188/185613310-548c16b3-5975-4f94-8456-f4e80346fee3.png">
   


