Posted to reviews@spark.apache.org by "khalidmammadov (via GitHub)" <gi...@apache.org> on 2023/03/05 07:54:10 UTC

[GitHub] [spark] khalidmammadov commented on a diff in pull request #40015: [SPARK-42437][PySpark][Connect] PySpark catalog.cacheTable will allow to specify storage level

khalidmammadov commented on code in PR #40015:
URL: https://github.com/apache/spark/pull/40015#discussion_r1125615556


##########
connector/connect/common/src/main/protobuf/spark/connect/types.proto:
##########
@@ -184,3 +184,15 @@ message DataType {
     DataType sql_type = 5;
   }
 }
+
+enum StorageLevel {

Review Comment:
   @zhengruifeng thanks for the review.
   I have removed the enum and the static mappings.
   Now the storage level is resolved from user input, which lets Python users pass the PySpark constants or customise one if needed. This is the same logic PySpark itself uses to accept and resolve a storage level from the user.
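   For illustration, a minimal sketch of that resolution logic (the proto module path and the message/field names below are assumptions for this sketch, not necessarily the exact ones in this PR): a PySpark StorageLevel, whether a predefined constant or a custom instance, is copied field by field onto the Connect message instead of being matched against a fixed enum.
   
       from pyspark.storagelevel import StorageLevel
       import pyspark.sql.connect.proto as pb2  # assumed location of the generated proto classes
       
       # Users can pass one of the predefined PySpark constants...
       level = StorageLevel.MEMORY_AND_DISK
       # ...or build a custom level from its five fields.
       custom = StorageLevel(useDisk=True, useMemory=True, useOffHeap=False,
                             deserialized=False, replication=2)
       
       def to_proto_storage_level(level: StorageLevel) -> "pb2.StorageLevel":
           # Copy the user-supplied fields onto the Connect message,
           # rather than resolving against a static enum of known levels.
           msg = pb2.StorageLevel()  # assumed message shape: five scalar fields
           msg.use_disk = level.useDisk
           msg.use_memory = level.useMemory
           msg.use_off_heap = level.useOffHeap
           msg.deserialized = level.deserialized
           msg.replication = level.replication
           return msg
   
   With that in place, the user-facing call would look roughly like the following (the storageLevel parameter name is taken from the PR title, so treat it as illustrative):
   
       spark.catalog.cacheTable("my_table", storageLevel=StorageLevel.MEMORY_ONLY)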



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org