Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/10/04 13:22:21 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #26013: [SPARK-29347][SQL] Add JSON serialization for external Rows
URL: https://github.com/apache/spark/pull/26013#discussion_r331497513
 
 

 ##########
 File path: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala
 ##########
 @@ -501,4 +513,88 @@ trait Row extends Serializable {
   private def getAnyValAs[T <: AnyVal](i: Int): T =
     if (isNullAt(i)) throw new NullPointerException(s"Value at index $i is null")
     else getAs[T](i)
+
+  /** The compact JSON representation of this row. */
+  def json: String = compact(jsonValue)
 
 Review comment:
   Hm, this API already looks pretty slow, and I suspect it could be called in a critical path. If it is meant to be used in a critical path, we might rather provide an API that builds a converter function from a given schema (so that we avoid type dispatch for every row).
   
   One rather minor concern is that the JSON representation for a row seems different compared to the JSON datasource, e.g. https://github.com/apache/spark/pull/26013/files#r331463832 and https://github.com/apache/spark/pull/26013/files#diff-78ce4e47d137bbb0d4350ad732b48d5bR576-R578
   
   and the code here duplicates that logic a bit.
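   To illustrate the suggestion above: instead of dispatching on each value's runtime type for every row, the schema can be inspected once to build a per-field converter, and the resulting closure reused across rows. This is a minimal, self-contained sketch; the names (`RowJsonSketch`, `makeConverter`, the toy `FieldType` hierarchy) are hypothetical and not Spark's actual API, and a real implementation would emit JSON via a proper generator rather than string concatenation.
   
   ```scala
   // Hedged sketch: build field converters once from a (toy) schema so that
   // per-row serialization avoids a type dispatch for every value.
   object RowJsonSketch {
     sealed trait FieldType
     case object IntType extends FieldType
     case object StringType extends FieldType
   
     /** Inspect the schema once and return a reusable row-to-JSON function. */
     def makeConverter(schema: Seq[(String, FieldType)]): Seq[Any] => String = {
       // Type dispatch happens here, once per field, not once per row value.
       val fieldConverters: Seq[(String, Any => String)] = schema.map {
         case (name, IntType)    => (name, (v: Any) => v.toString)
         case (name, StringType) => (name, (v: Any) => "\"" + v.toString + "\"")
       }
       // The returned closure only applies the prebuilt converters.
       (values: Seq[Any]) =>
         fieldConverters.zip(values).map { case ((name, conv), v) =>
           "\"" + name + "\":" + conv(v)
         }.mkString("{", ",", "}")
     }
   }
   ```
   
   With this shape, callers serializing many rows of the same schema would pay the schema inspection cost once, e.g. `val toJson = RowJsonSketch.makeConverter(schema)` followed by `rows.map(toJson)`.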
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services