Posted to reviews@spark.apache.org by "rangadi (via GitHub)" <gi...@apache.org> on 2023/04/27 22:00:57 UTC

[GitHub] [spark] rangadi commented on a diff in pull request #40983: [SPARK-43312][PROTOBUF] Option to convert Any fields into JSON

rangadi commented on code in PR #40983:
URL: https://github.com/apache/spark/pull/40983#discussion_r1179717089
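
For context on the feature under review: the PR adds an option that makes
`google.protobuf.Any` fields deserialize to a JSON string instead of a struct
of `type_url` and raw `value` bytes. Below is a hedged sketch of how that
could look through the connector's `from_protobuf` function; the option key,
message name, and file paths are assumptions inferred from the PR title, not
confirmed anywhere in this thread.

    import java.util.Collections
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.protobuf.functions.from_protobuf

    val spark = SparkSession.builder().appName("any-to-json-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical input: files holding serialized Envelope messages, where
    // Envelope contains a google.protobuf.Any field. The binaryFile source
    // exposes each file's raw bytes in a `content` column.
    val df = spark.read.format("binaryFile").load("/tmp/envelopes")

    val parsed = df.select(
      from_protobuf(
        $"content",                      // serialized protobuf bytes
        "Envelope",                      // hypothetical message name
        "/tmp/envelope.desc",            // hypothetical descriptor file
        Collections.singletonMap("convert.any.fields.to.json", "true") // assumed option key
      ).as("envelope"))
    // With the option enabled, the Any field would come back as a JSON string
    // rather than a struct of (type_url, value).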


##########
connector/protobuf/src/main/scala/org/apache/spark/sql/protobuf/utils/ProtobufOptions.scala:
##########
@@ -39,13 +39,67 @@ private[sql] class ProtobufOptions(
   val parseMode: ParseMode =
     parameters.get("mode").map(ParseMode.fromString).getOrElse(FailFastMode)
 
-  // Setting the `recursive.fields.max.depth` to 1 allows it to recurse once,
-  // and 2 allows it to recurse twice, and so on. A value of `recursive.fields.max.depth`
-  // greater than 10 is not permitted. If it is not specified, the default value is -1;
-  // a value of 0 or below disallows any recursive fields. If a protobuf
-  // record has more depth than the allowed value for recursive fields, it will be truncated
-  // and the corresponding fields are ignored (dropped).
+  /**
+   * Adds support for recursive fields. If this option is not specified, recursive fields are

Review Comment:
   Though not related to this PR, I expanded the documentation for `recursive.fields.max.depth` with a clarifying example.
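
   For readers of the archive, a minimal sketch of passing
   `recursive.fields.max.depth` through the connector's `from_protobuf`
   function. The message, descriptor path, and input source are hypothetical;
   only the option key comes from the documentation quoted above.

       import java.util.Collections
       import org.apache.spark.sql.SparkSession
       import org.apache.spark.sql.protobuf.functions.from_protobuf

       val spark = SparkSession.builder().appName("recursive-depth-sketch").getOrCreate()
       import spark.implicits._

       // Hypothetical recursive message:
       //   message Person { string name = 1; Person friend = 2; }
       val df = spark.read.format("binaryFile").load("/tmp/person-messages")

       val parsed = df.select(
         from_protobuf(
           $"content",                    // serialized protobuf bytes
           "Person",                      // hypothetical message name
           "/tmp/person.desc",            // hypothetical descriptor file
           Collections.singletonMap("recursive.fields.max.depth", "2") // bound recursion depth
         ).as("person"))
       // Nesting beyond the configured limit is truncated and the corresponding
       // fields are dropped, per the documentation above.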



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org