Posted to issues@kylin.apache.org by GitBox <gi...@apache.org> on 2021/03/05 03:28:08 UTC

[GitHub] [kylin] zhengshengjun opened a new pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zhengshengjun opened a new pull request #1603:
URL: https://github.com/apache/kylin/pull/1603


   ## Proposed changes
   
   Describe the big picture of your changes here to communicate to the maintainers why we should accept this pull request. If it fixes a bug or resolves a feature request, be sure to link to that issue.
   
   ## Types of changes
   
   What types of changes does your code introduce to Kylin?
   _Put an `x` in the boxes that apply_
   
   - [ ] Bugfix (non-breaking change which fixes an issue)
   - [ ] New feature (non-breaking change which adds functionality)
   - [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
   - [ ] Documentation Update (if none of the other choices apply)
   
   ## Checklist
   
   _Put an `x` in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your code._
   
   - [x] I have created an issue on [Kylin's jira](https://issues.apache.org/jira/browse/KYLIN), and have described the bug/feature there in detail
   - [x] Commit messages in my PR start with the related jira ID, like "KYLIN-0000 Make Kylin project open-source"
   - [x] Compiling and unit tests pass locally with my changes
   - [ ] I have added tests that prove my fix is effective or that my feature works
   - [ ] If this change needs a document change, I will prepare another PR against the `document` branch
   - [ ] Any dependent changes have been merged
   
   ## Further comments
   
   If this is a relatively large or complex change, kick off the discussion at user@kylin or dev@kylin by explaining why you chose the solution you did and what alternatives you considered, etc...
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [kylin] zzcclp commented on a change in pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zzcclp commented on a change in pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#discussion_r603359932



##########
File path: kylin-spark-project/kylin-spark-common/src/main/scala/org/apache/spark/sql/execution/datasource/ResetShufflePartition.scala
##########
@@ -23,17 +23,14 @@ import org.apache.spark.sql.SparkSession
 import org.apache.spark.utils.SparderUtils
 
 trait ResetShufflePartition extends Logging {
-  val PARTITION_SPLIT_BYTES: Long =
-    KylinConfig.getInstanceFromEnv.getQueryPartitionSplitSizeMB * 1024 * 1024 // 64MB
 
-  def setShufflePartitions(bytes: Long, sparkSession: SparkSession): Unit = {
+  def setShufflePartitions(bytes: Long, sparkSession: SparkSession, conf: KylinConfig): Unit = {
     QueryContextFacade.current().addAndGetSourceScanBytes(bytes)
     val defaultParallelism = SparderUtils.getTotalCore(sparkSession.sparkContext.getConf)
-    val kylinConfig = KylinConfig.getInstanceFromEnv
-    val partitionsNum = if (kylinConfig.getSparkSqlShufflePartitions != -1) {
-      kylinConfig.getSparkSqlShufflePartitions
+    val partitionsNum = if (conf.getSparkSqlShufflePartitions != -1) {
+      conf.getSparkSqlShufflePartitions
     } else {
-      Math.min(QueryContextFacade.current().getSourceScanBytes / PARTITION_SPLIT_BYTES + 1,
+      Math.min(QueryContextFacade.current().getSourceScanBytes / (conf.getQueryPartitionSplitSizeMB * 1024 * 1024) + 1,

Review comment:
       This line exceeds 100 characters.







[GitHub] [kylin] zzcclp merged pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zzcclp merged pull request #1603:
URL: https://github.com/apache/kylin/pull/1603


   





[GitHub] [kylin] zzcclp commented on pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zzcclp commented on pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#issuecomment-809447972


   Please squash the two commits into one and rebase onto the kylin-on-parquet-v2 branch.

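   The squash the reviewer asks for can be done non-interactively. A minimal sketch, demonstrated in a throwaway repository so it is safe to run; the file names, commit messages, and the follow-up rebase/force-push mentioned in the comments are illustrative, not taken from this thread:

   ```shell
   # Squash the last two commits into one, shown in a throwaway repo.
   # On the real PR branch you would follow this with something like
   # `git pull --rebase upstream kylin-on-parquet-v2` and a force-push
   # to your fork (remote/branch names here are hypothetical).
   set -e
   repo=$(mktemp -d)
   cd "$repo"
   git init -q
   git config user.email dev@example.com
   git config user.name dev
   git commit --allow-empty -qm "base"           # stand-in for the branch point
   echo change1 > FilePruner.scala
   git add FilePruner.scala
   git commit -qm "KYLIN-4918 Support Cube Level configuration in FilePruner"
   echo change2 >> FilePruner.scala
   git add FilePruner.scala
   git commit -qm "KYLIN-4918 address review comments"
   # Move HEAD back two commits but keep the combined changes staged,
   # then re-commit them as a single commit.
   git reset --soft HEAD~2
   git commit -qm "KYLIN-4918 Support Cube Level configuration in FilePruner"
   git rev-list --count HEAD   # the base commit plus one squashed commit
   ```

   `git reset --soft` keeps the working tree and index intact, so the single re-created commit contains exactly the combined diff of the two originals.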




[GitHub] [kylin] zzcclp commented on a change in pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zzcclp commented on a change in pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#discussion_r603356675



##########
File path: kylin-spark-project/kylin-spark-common/src/main/scala/org/apache/spark/sql/execution/datasource/FilePruner.scala
##########
@@ -78,6 +77,8 @@ class FilePruner(cubeInstance: CubeInstance,
                  val options: Map[String, String])
   extends FileIndex with ResetShufflePartition with Logging {
 
+  val MAX_SHARDING_SIZE_PER_TASK: Long = cubeInstance.getConfig.getMaxShardingSizeMBPerTask * 1024 * 1024

Review comment:
       This line will exceed 100 characters; please change it to:
   ```
     val MAX_SHARDING_SIZE_PER_TASK: Long =
       cubeInstance.getConfig.getMaxShardingSizeMBPerTask * 1024 * 1024
   ```







[GitHub] [kylin] zhengshengjun commented on a change in pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zhengshengjun commented on a change in pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#discussion_r603747164



##########
File path: kylin-spark-project/kylin-spark-common/src/main/scala/org/apache/spark/sql/execution/datasource/FilePruner.scala
##########
@@ -78,6 +77,8 @@ class FilePruner(cubeInstance: CubeInstance,
                  val options: Map[String, String])
   extends FileIndex with ResetShufflePartition with Logging {
 
+  val MAX_SHARDING_SIZE_PER_TASK: Long = cubeInstance.getConfig.getMaxShardingSizeMBPerTask * 1024 * 1024

Review comment:
       done

##########
File path: kylin-spark-project/kylin-spark-common/src/main/scala/org/apache/spark/sql/execution/datasource/ResetShufflePartition.scala
##########
@@ -23,17 +23,14 @@ import org.apache.spark.sql.SparkSession
 import org.apache.spark.utils.SparderUtils
 
 trait ResetShufflePartition extends Logging {
-  val PARTITION_SPLIT_BYTES: Long =
-    KylinConfig.getInstanceFromEnv.getQueryPartitionSplitSizeMB * 1024 * 1024 // 64MB
 
-  def setShufflePartitions(bytes: Long, sparkSession: SparkSession): Unit = {
+  def setShufflePartitions(bytes: Long, sparkSession: SparkSession, conf: KylinConfig): Unit = {
     QueryContextFacade.current().addAndGetSourceScanBytes(bytes)
     val defaultParallelism = SparderUtils.getTotalCore(sparkSession.sparkContext.getConf)
-    val kylinConfig = KylinConfig.getInstanceFromEnv
-    val partitionsNum = if (kylinConfig.getSparkSqlShufflePartitions != -1) {
-      kylinConfig.getSparkSqlShufflePartitions
+    val partitionsNum = if (conf.getSparkSqlShufflePartitions != -1) {
+      conf.getSparkSqlShufflePartitions
     } else {
-      Math.min(QueryContextFacade.current().getSourceScanBytes / PARTITION_SPLIT_BYTES + 1,
+      Math.min(QueryContextFacade.current().getSourceScanBytes / (conf.getQueryPartitionSplitSizeMB * 1024 * 1024) + 1,

Review comment:
       done







[GitHub] [kylin] zzcclp commented on pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

zzcclp commented on pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#issuecomment-809447064


   LGTM except for two comments.





[GitHub] [kylin] codecov-io commented on pull request #1603: KYLIN-4918 Support Cube Level configuration in FilePruner

codecov-io commented on pull request #1603:
URL: https://github.com/apache/kylin/pull/1603#issuecomment-791136157


   # [Codecov](https://codecov.io/gh/apache/kylin/pull/1603?src=pr&el=h1) Report
   > :exclamation: No coverage uploaded for pull request base (`kylin-on-parquet-v2@01036af`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/kylin/pull/1603/graphs/tree.svg?width=650&height=150&src=pr&token=JawVgbgsVo)](https://codecov.io/gh/apache/kylin/pull/1603?src=pr&el=tree)
   
   ```diff
   @@                  Coverage Diff                   @@
   ##             kylin-on-parquet-v2    #1603   +/-   ##
   ======================================================
     Coverage                       ?   24.34%           
     Complexity                     ?     4649           
   ======================================================
     Files                          ?     1145           
     Lines                          ?    65365           
     Branches                       ?     9601           
   ======================================================
     Hits                           ?    15913           
     Misses                         ?    47787           
     Partials                       ?     1665           
   ```
   
   
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/kylin/pull/1603?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/kylin/pull/1603?src=pr&el=footer). Last update [01036af...d8b4c20](https://codecov.io/gh/apache/kylin/pull/1603?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   

