Posted to reviews@spark.apache.org by HyukjinKwon <gi...@git.apache.org> on 2018/08/28 03:55:19 UTC

[GitHub] spark pull request #22226: [SPARK-25252][SQL] Support arrays of any types by...

Github user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22226#discussion_r213177790
  
    --- Diff: python/pyspark/sql/functions.py ---
    @@ -2289,12 +2289,10 @@ def from_json(col, schema, options={}):
     @since(2.1)
     def to_json(col, options={}):
         """
    -    Converts a column containing a :class:`StructType`, :class:`ArrayType` of
    -    :class:`StructType`\\s, a :class:`MapType` or :class:`ArrayType` of :class:`MapType`\\s
    +    Converts a column containing a :class:`StructType`, :class:`ArrayType` or a :class:`MapType`
         into a JSON string. Throws an exception, in the case of an unsupported type.
     
    -    :param col: name of column containing the struct, array of the structs, the map or
    -        array of the maps.
    +    :param col: name of column containing a struct, an array or a map.
         :param options: options to control converting. accepts the same options as the json datasource
    --- End diff --
    
    ditto
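    For context on the docstring under review: after SPARK-25252, `to_json` accepts a column
    containing a struct, an array, or a map and serializes it to a JSON string. The sketch
    below uses only the Python standard library (not the pyspark API itself; the column
    values shown are made up) to illustrate the JSON shapes each input type maps to:

    ```python
    import json

    # A struct column serializes to a JSON object with one field per struct field.
    struct_row = {"name": "Alice", "age": 2}
    print(json.dumps(struct_row, separators=(",", ":")))  # {"name":"Alice","age":2}

    # An array column serializes to a JSON array; per SPARK-25252 the element
    # type is no longer restricted to structs or maps.
    array_row = [1, 2, 3]
    print(json.dumps(array_row, separators=(",", ":")))   # [1,2,3]

    # A map column serializes to a JSON object keyed by the map's keys.
    map_row = {"key": "value"}
    print(json.dumps(map_row, separators=(",", ":")))     # {"key":"value"}
    ```

    Unsupported column types still raise an exception in `to_json`, as the docstring notes.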


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org