Posted to commits@spark.apache.org by li...@apache.org on 2017/08/05 17:11:02 UTC

[2/2] spark git commit: [SPARK-21485][FOLLOWUP][SQL][DOCS] Describes examples and arguments separately, and note/since in SQL built-in function documentation

[SPARK-21485][FOLLOWUP][SQL][DOCS] Describes examples and arguments separately, and note/since in SQL built-in function documentation

## What changes were proposed in this pull request?

This PR proposes to separate `extended` into `examples` and `arguments` internally so that both can be documented separately, and to add `since` and `note` for additional information.
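
For illustration, an expression documented with the separated fields could look roughly like the sketch below. The expression and its texts are hypothetical; real usages appear in the diff (e.g. `CallMethodViaReflection` uses `examples`, and `JsonToStructs` additionally uses `since`):

```
import org.apache.spark.sql.catalyst.expressions.ExpressionDescription

// Dummy carrier class, only to show the annotation shape; real usages annotate
// catalyst Expression case classes, as in the diff below.
@ExpressionDescription(
  usage = "_FUNC_(str, n) - Repeats `str` `n` times.",
  arguments = """
    Arguments:
      * str - a string expression.
      * n - a positive integer.
  """,
  examples = """
    Examples:
      > SELECT _FUNC_('ab', 2);
       abab
  """,
  note = "Returns NULL if `n` is NULL.",
  since = "2.3.0")
class DummyDocumentedExpression
```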

For `since`, users sometimes seem to get confused by missing version information, as far as I know. For example, see https://www.mail-archive.com/userspark.apache.org/msg64798.html

For a few good examples of the built documentation, please see both:
`from_json` - https://spark-test.github.io/sparksqldoc/#from_json
`like` - https://spark-test.github.io/sparksqldoc/#like

For `DESCRIBE FUNCTION`, `note` and `since` are added as below:

```
> DESCRIBE FUNCTION EXTENDED rlike;
...
Extended Usage:
    Arguments:
      ...

    Examples:
      ...

    Note:
      Use LIKE to match with simple string pattern
```

```
> DESCRIBE FUNCTION EXTENDED to_json;
...
    Examples:
      ...

    Since: 2.2.0
```
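
Under the hood, these separate pieces are stitched back into the extended description by the reworked `ExpressionInfo` constructor (see the diff below). A rough sketch, with a made-up class name and texts, of exercising that constructor directly:

```
import org.apache.spark.sql.catalyst.expressions.ExpressionInfo

// Hypothetical inputs; normally these come from the @ExpressionDescription annotation.
val info = new ExpressionInfo(
  "org.example.SomeExpression",                                  // className (made up)
  null,                                                          // db
  "some_func",                                                   // name
  "some_func(expr) - Does something with `expr`.",               // usage
  "\n    Arguments:\n      * expr - any value.\n",               // arguments
  "\n    Examples:\n      > SELECT some_func(1);\n       1\n",   // examples
  "some_func is only an illustration.",                          // note
  "2.2.0")                                                       // since

// getExtended() is arguments + examples with the "Note:" and "Since:" sections
// appended, i.e. the text shown by DESCRIBE FUNCTION EXTENDED.
println(info.getExtended())
```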

For the complete documentation, see https://spark-test.github.io/sparksqldoc/

## How was this patch tested?

Manual tests and existing tests. Please see https://spark-test.github.io/sparksqldoc

Jenkins tests are needed to double-check.

Author: hyukjinkwon <gu...@gmail.com>

Closes #18749 from HyukjinKwon/followup-sql-doc-gen.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/ba327ee5
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/ba327ee5
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/ba327ee5

Branch: refs/heads/master
Commit: ba327ee54c32b11107793604895bd38559804858
Parents: 3a45c7f
Author: hyukjinkwon <gu...@gmail.com>
Authored: Sat Aug 5 10:10:56 2017 -0700
Committer: gatorsmile <ga...@gmail.com>
Committed: Sat Aug 5 10:10:56 2017 -0700

----------------------------------------------------------------------
 .../expressions/ExpressionDescription.java      |  42 +++++-
 .../catalyst/expressions/ExpressionInfo.java    |  65 ++++++++-
 .../catalyst/analysis/FunctionRegistry.scala    |  20 ++-
 .../expressions/CallMethodViaReflection.scala   |   2 +-
 .../spark/sql/catalyst/expressions/Cast.scala   |   2 +-
 .../aggregate/ApproximatePercentile.scala       |   2 +-
 .../sql/catalyst/expressions/arithmetic.scala   |  20 +--
 .../expressions/bitwiseExpressions.scala        |   8 +-
 .../expressions/collectionOperations.scala      |  10 +-
 .../expressions/complexTypeCreator.scala        |  11 +-
 .../expressions/conditionalExpressions.scala    |   2 +-
 .../expressions/datetimeExpressions.scala       |  52 +++----
 .../sql/catalyst/expressions/generators.scala   |   8 +-
 .../spark/sql/catalyst/expressions/hash.scala   |  10 +-
 .../catalyst/expressions/jsonExpressions.scala  |  14 +-
 .../catalyst/expressions/mathExpressions.scala  |  80 +++++------
 .../spark/sql/catalyst/expressions/misc.scala   |   6 +-
 .../catalyst/expressions/nullExpressions.scala  |  18 +--
 .../expressions/randomExpressions.scala         |   4 +-
 .../expressions/regexpExpressions.scala         |  69 ++++-----
 .../expressions/stringExpressions.scala         |  87 ++++++------
 .../sql/catalyst/expressions/xml/xpath.scala    |  16 +--
 .../resources/sql-tests/results/cast.sql.out    |   4 +-
 .../sql-tests/results/json-functions.sql.out    |   4 +
 sql/gen-sql-markdown.py                         | 142 ++++++++++++++++---
 25 files changed, 461 insertions(+), 237 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionDescription.java
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionDescription.java b/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionDescription.java
index 62a2ce4..ea6fffa 100644
--- a/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionDescription.java
+++ b/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionDescription.java
@@ -24,20 +24,50 @@ import java.lang.annotation.RetentionPolicy;
 
 /**
  * ::DeveloperApi::
-
+ *
  * A function description type which can be recognized by FunctionRegistry, and will be used to
  * show the usage of the function in human language.
  *
  * `usage()` will be used for the function usage in brief way.
- * `extended()` will be used for the function usage in verbose way, suppose
- *              an example will be provided.
  *
- *  And we can refer the function name by `_FUNC_`, in `usage` and `extended`, as it's
+ * These below are concatenated and used for the function usage in verbose way, suppose arguments,
+ * examples, note and since will be provided.
+ *
+ * `arguments()` describes arguments for the expression. This should follow the format as below:
+ *
+ *   Arguments:
+ *     * arg0 - ...
+ *         ....
+ *     * arg1 - ...
+ *         ....
+ *
+ * `examples()` describes examples for the expression. This should follow the format as below:
+ *
+ *   Examples:
+ *     > SELECT ...;
+ *      ...
+ *     > SELECT ...;
+ *      ...
+ *
+ * `note()` contains some notes for the expression optionally.
+ *
+ * `since()` contains version information for the expression. Version is specified by,
+ * for example, "2.2.0".
+ *
+ *  We can refer the function name by `_FUNC_`, in `usage`, `arguments` and `examples`, as it's
  *  registered in `FunctionRegistry`.
+ *
+ *  Note that, if `extended()` is defined, `arguments()`, `examples()`, `note()` and `since()` will
+ *  be ignored and `extended()` will be used for the extended description for backward
+ *  compatibility.
  */
 @DeveloperApi
 @Retention(RetentionPolicy.RUNTIME)
 public @interface ExpressionDescription {
-    String usage() default "_FUNC_ is undocumented";
-    String extended() default "\n    No example/argument for _FUNC_.\n";
+    String usage() default "";
+    String extended() default "";
+    String arguments() default "";
+    String examples() default "";
+    String note() default "";
+    String since() default "";
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java b/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java
index 4565ed4..ab13ac9 100644
--- a/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java
+++ b/sql/catalyst/src/main/java/org/apache/spark/sql/catalyst/expressions/ExpressionInfo.java
@@ -26,6 +26,10 @@ public class ExpressionInfo {
     private String name;
     private String extended;
     private String db;
+    private String arguments;
+    private String examples;
+    private String note;
+    private String since;
 
     public String getClassName() {
         return className;
@@ -43,23 +47,76 @@ public class ExpressionInfo {
         return extended;
     }
 
+    public String getSince() {
+        return since;
+    }
+
+    public String getArguments() {
+        return arguments;
+    }
+
+    public String getExamples() {
+        return examples;
+    }
+
+    public String getNote() {
+        return note;
+    }
+
     public String getDb() {
         return db;
     }
 
-    public ExpressionInfo(String className, String db, String name, String usage, String extended) {
+    public ExpressionInfo(
+            String className,
+            String db,
+            String name,
+            String usage,
+            String arguments,
+            String examples,
+            String note,
+            String since) {
+        assert name != null;
+        assert arguments != null;
+        assert examples != null;
+        assert examples.isEmpty() || examples.startsWith("\n    Examples:");
+        assert note != null;
+        assert since != null;
+
         this.className = className;
         this.db = db;
         this.name = name;
         this.usage = usage;
-        this.extended = extended;
+        this.arguments = arguments;
+        this.examples = examples;
+        this.note = note;
+        this.since = since;
+
+        // Make the extended description.
+        this.extended = arguments + examples;
+        if (this.extended.isEmpty()) {
+            this.extended = "\n    No example/argument for _FUNC_.\n";
+        }
+        if (!note.isEmpty()) {
+            this.extended += "\n    Note:\n      " + note.trim() + "\n";
+        }
+        if (!since.isEmpty()) {
+            this.extended += "\n    Since: " + since + "\n";
+        }
     }
 
     public ExpressionInfo(String className, String name) {
-        this(className, null, name, null, null);
+        this(className, null, name, null, "", "", "", "");
     }
 
     public ExpressionInfo(String className, String db, String name) {
-        this(className, db, name, null, null);
+        this(className, db, name, null, "", "", "", "");
+    }
+
+    // This is to keep the original constructor just in case.
+    public ExpressionInfo(String className, String db, String name, String usage, String extended) {
+        // `arguments` and `examples` are concatenated for the extended description. So, here
+        // simply pass the `extended` as `arguments` and an empty string for `examples`.
+        this(className, db, name, usage, extended, "", "", "");
     }
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala
index 10b22ae..11538bd 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala
@@ -557,7 +557,9 @@ object FunctionRegistry {
     }
     val clazz = scala.reflect.classTag[Cast].runtimeClass
     val usage = "_FUNC_(expr) - Casts the value `expr` to the target data type `_FUNC_`."
-    (name, (new ExpressionInfo(clazz.getCanonicalName, null, name, usage, null), builder))
+    val expressionInfo =
+      new ExpressionInfo(clazz.getCanonicalName, null, name, usage, "", "", "", "")
+    (name, (expressionInfo, builder))
   }
 
   /**
@@ -567,7 +569,21 @@ object FunctionRegistry {
     val clazz = scala.reflect.classTag[T].runtimeClass
     val df = clazz.getAnnotation(classOf[ExpressionDescription])
     if (df != null) {
-      new ExpressionInfo(clazz.getCanonicalName, null, name, df.usage(), df.extended())
+      if (df.extended().isEmpty) {
+        new ExpressionInfo(
+          clazz.getCanonicalName,
+          null,
+          name,
+          df.usage(),
+          df.arguments(),
+          df.examples(),
+          df.note(),
+          df.since())
+      } else {
+        // This exists for the backward compatibility with old `ExpressionDescription`s defining
+        // the extended description in `extended()`.
+        new ExpressionInfo(clazz.getCanonicalName, null, name, df.usage(), df.extended())
+      }
     } else {
       new ExpressionInfo(clazz.getCanonicalName, name)
     }

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala
index 4859e0c..cd97304 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/CallMethodViaReflection.scala
@@ -45,7 +45,7 @@ import org.apache.spark.util.Utils
  */
 @ExpressionDescription(
   usage = "_FUNC_(class, method[, arg1[, arg2 ..]]) - Calls a method with reflection.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('java.util.UUID', 'randomUUID');
        c33fb387-8500-4bfa-81d2-6e0e3e930df2

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
index 3862e64..d949b8f 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala
@@ -164,7 +164,7 @@ object Cast {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('10' as int);
        10

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala
index 1ec2e4a..896c009 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala
@@ -58,7 +58,7 @@ import org.apache.spark.sql.types._
       In this case, returns the approximate percentile array of column `col` at the given
       percentage array.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10.0, array(0.5, 0.4, 0.1), 100);
        [10.0,10.0,10.0]

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
index 94264e8..7559852 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
@@ -26,7 +26,7 @@ import org.apache.spark.unsafe.types.CalendarInterval
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the negated value of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        -1
@@ -89,7 +89,7 @@ case class UnaryPositive(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the absolute value of the numeric value.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(-1);
        1
@@ -141,7 +141,7 @@ object BinaryArithmetic {
 
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns `expr1`+`expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 1 _FUNC_ 2;
        3
@@ -177,7 +177,7 @@ case class Add(left: Expression, right: Expression) extends BinaryArithmetic {
 
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns `expr1`-`expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 2 _FUNC_ 1;
        1
@@ -213,7 +213,7 @@ case class Subtract(left: Expression, right: Expression) extends BinaryArithmeti
 
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns `expr1`*`expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 2 _FUNC_ 3;
        6
@@ -233,7 +233,7 @@ case class Multiply(left: Expression, right: Expression) extends BinaryArithmeti
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns `expr1`/`expr2`. It always performs floating point division.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 3 _FUNC_ 2;
        1.5
@@ -316,7 +316,7 @@ case class Divide(left: Expression, right: Expression) extends BinaryArithmetic
 
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns the remainder after `expr1`/`expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 2 _FUNC_ 1.8;
        0.2
@@ -403,7 +403,7 @@ case class Remainder(left: Expression, right: Expression) extends BinaryArithmet
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns the positive value of `expr1` mod `expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10, 3);
        1
@@ -562,7 +562,7 @@ case class Pmod(left: Expression, right: Expression) extends BinaryArithmetic {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr, ...) - Returns the least value of all parameters, skipping null values.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10, 9, 2, 4, 3);
        2
@@ -628,7 +628,7 @@ case class Least(children: Seq[Expression]) extends Expression {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr, ...) - Returns the greatest value of all parameters, skipping null values.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10, 9, 2, 4, 3);
        10

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/bitwiseExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/bitwiseExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/bitwiseExpressions.scala
index 425efbb..173481f 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/bitwiseExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/bitwiseExpressions.scala
@@ -28,7 +28,7 @@ import org.apache.spark.sql.types._
  */
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns the result of bitwise AND of `expr1` and `expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 3 _FUNC_ 5;
        1
@@ -60,7 +60,7 @@ case class BitwiseAnd(left: Expression, right: Expression) extends BinaryArithme
  */
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns the result of bitwise OR of `expr1` and `expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 3 _FUNC_ 5;
        7
@@ -92,7 +92,7 @@ case class BitwiseOr(left: Expression, right: Expression) extends BinaryArithmet
  */
 @ExpressionDescription(
   usage = "expr1 _FUNC_ expr2 - Returns the result of bitwise exclusive OR of `expr1` and `expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT 3 _FUNC_ 5;
        2
@@ -122,7 +122,7 @@ case class BitwiseXor(left: Expression, right: Expression) extends BinaryArithme
  */
 @ExpressionDescription(
   usage = "_FUNC_ expr - Returns the result of bitwise NOT of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_ 0;
        -1

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
index 83a23cc..4270b98 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala
@@ -29,7 +29,7 @@ import org.apache.spark.sql.types._
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the size of an array or a map. Returns -1 if null.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array('b', 'd', 'c', 'a'));
        4
@@ -64,7 +64,7 @@ case class Size(child: Expression) extends UnaryExpression with ExpectsInputType
  */
 @ExpressionDescription(
   usage = "_FUNC_(map) - Returns an unordered array containing the keys of the map.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(map(1, 'a', 2, 'b'));
        [1,2]
@@ -92,7 +92,7 @@ case class MapKeys(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(map) - Returns an unordered array containing the values of the map.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(map(1, 'a', 2, 'b'));
        ["a","b"]
@@ -122,7 +122,7 @@ case class MapValues(child: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(array[, ascendingOrder]) - Sorts the input array in ascending or descending order according to the natural ordering of the array elements.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array('b', 'd', 'c', 'a'), true);
        ["a","b","c","d"]
@@ -217,7 +217,7 @@ case class SortArray(base: Expression, ascendingOrder: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(array, value) - Returns true if the array contains the value.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array(1, 2, 3), 2);
        true

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala
index d9eeb53..4b6574a 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala
@@ -32,7 +32,7 @@ import org.apache.spark.unsafe.types.UTF8String
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr, ...) - Returns an array with the given elements.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1, 2, 3);
        [1,2,3]
@@ -157,7 +157,7 @@ private [sql] object GenArrayData {
  */
 @ExpressionDescription(
   usage = "_FUNC_(key0, value0, key1, value1, ...) - Creates a map with the given key/value pairs.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1.0, '2', 3.0, '4');
        {1.0:"2",3.0:"4"}
@@ -264,6 +264,9 @@ object CreateStruct extends FunctionBuilder {
       null,
       "struct",
       "_FUNC_(col1, col2, col3, ...) - Creates a struct with the given field values.",
+      "",
+      "",
+      "",
       "")
     ("struct", (info, this))
   }
@@ -336,7 +339,7 @@ trait CreateNamedStructLike extends Expression {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(name1, val1, name2, val2, ...) - Creates a struct with the given field names and values.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_("a", 1, "b", 2, "c", 3);
        {"a":1,"b":2,"c":3}
@@ -393,7 +396,7 @@ case class CreateNamedStructUnsafe(children: Seq[Expression]) extends CreateName
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(text[, pairDelim[, keyValueDelim]]) - Creates a map after splitting the text into key/value pairs using delimiters. Default delimiters are ',' for `pairDelim` and ':' for `keyValueDelim`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('a:1,b:2,c:3', ',', ':');
        map("a":"1","b":"2","c":"3")

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/conditionalExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/conditionalExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/conditionalExpressions.scala
index ae8efb6..b59b6de 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/conditionalExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/conditionalExpressions.scala
@@ -25,7 +25,7 @@ import org.apache.spark.sql.types._
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2, expr3) - If `expr1` evaluates to true, then returns `expr2`; otherwise returns `expr3`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1 < 2, 'a', 'b');
        a

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
index 505ed94..7dc32e1 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/datetimeExpressions.scala
@@ -137,7 +137,7 @@ case class CurrentBatchTimestamp(
  */
 @ExpressionDescription(
   usage = "_FUNC_(start_date, num_days) - Returns the date that is `num_days` after `start_date`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-07-30', 1);
        2016-07-31
@@ -170,7 +170,7 @@ case class DateAdd(startDate: Expression, days: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(start_date, num_days) - Returns the date that is `num_days` before `start_date`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-07-30', 1);
        2016-07-29
@@ -199,7 +199,7 @@ case class DateSub(startDate: Expression, days: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(timestamp) - Returns the hour component of the string/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30 12:58:59');
        12
@@ -229,7 +229,7 @@ case class Hour(child: Expression, timeZoneId: Option[String] = None)
 
 @ExpressionDescription(
   usage = "_FUNC_(timestamp) - Returns the minute component of the string/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30 12:58:59');
        58
@@ -259,7 +259,7 @@ case class Minute(child: Expression, timeZoneId: Option[String] = None)
 
 @ExpressionDescription(
   usage = "_FUNC_(timestamp) - Returns the second component of the string/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30 12:58:59');
        59
@@ -289,7 +289,7 @@ case class Second(child: Expression, timeZoneId: Option[String] = None)
 
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the day of year of the date/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-04-09');
        100
@@ -312,7 +312,7 @@ case class DayOfYear(child: Expression) extends UnaryExpression with ImplicitCas
 
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the year component of the date/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-07-30');
        2016
@@ -335,7 +335,7 @@ case class Year(child: Expression) extends UnaryExpression with ImplicitCastInpu
 
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the quarter of the year for date, in the range 1 to 4.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-08-31');
        3
@@ -358,7 +358,7 @@ case class Quarter(child: Expression) extends UnaryExpression with ImplicitCastI
 
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the month component of the date/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-07-30');
        7
@@ -381,7 +381,7 @@ case class Month(child: Expression) extends UnaryExpression with ImplicitCastInp
 
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the day of month of the date/timestamp.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30');
        30
@@ -405,7 +405,7 @@ case class DayOfMonth(child: Expression) extends UnaryExpression with ImplicitCa
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the day of the week for date/timestamp (1 = Sunday, 2 = Monday, ..., 7 = Saturday).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30');
        5
@@ -443,7 +443,7 @@ case class DayOfWeek(child: Expression) extends UnaryExpression with ImplicitCas
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the week of the year of the given date. A week is considered to start on a Monday and week 1 is the first week with >3 days.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2008-02-20');
        8
@@ -489,7 +489,7 @@ case class WeekOfYear(child: Expression) extends UnaryExpression with ImplicitCa
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(timestamp, fmt) - Converts `timestamp` to a value of string in the format specified by the date format `fmt`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-04-08', 'y');
        2016
@@ -530,7 +530,7 @@ case class DateFormatClass(left: Expression, right: Expression, timeZoneId: Opti
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr[, pattern]) - Returns the UNIX timestamp of the given time.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-04-08', 'yyyy-MM-dd');
        1460041200
@@ -568,7 +568,7 @@ case class ToUnixTimestamp(
  */
 @ExpressionDescription(
   usage = "_FUNC_([expr[, pattern]]) - Returns the UNIX timestamp of current or specified time.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        1476884637
@@ -720,7 +720,7 @@ abstract class UnixTime
  */
 @ExpressionDescription(
   usage = "_FUNC_(unix_time, format) - Returns `unix_time` in the specified `format`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0, 'yyyy-MM-dd HH:mm:ss');
        1970-01-01 00:00:00
@@ -829,7 +829,7 @@ case class FromUnixTime(sec: Expression, format: Expression, timeZoneId: Option[
  */
 @ExpressionDescription(
   usage = "_FUNC_(date) - Returns the last day of the month which the date belongs to.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-01-12');
        2009-01-31
@@ -863,7 +863,7 @@ case class LastDay(startDate: Expression) extends UnaryExpression with ImplicitC
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(start_date, day_of_week) - Returns the first date which is later than `start_date` and named as indicated.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2015-01-14', 'TU');
        2015-01-20
@@ -964,7 +964,7 @@ case class TimeAdd(start: Expression, interval: Expression, timeZoneId: Option[S
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(timestamp, timezone) - Given a timestamp, which corresponds to a certain time of day in UTC, returns another timestamp that corresponds to the same time of day in the given timezone.",
-  extended = """
+  examples = """
     Examples:
       > SELECT from_utc_timestamp('2016-08-31', 'Asia/Seoul');
        2016-08-31 09:00:00
@@ -1057,7 +1057,7 @@ case class TimeSub(start: Expression, interval: Expression, timeZoneId: Option[S
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(start_date, num_months) - Returns the date that is `num_months` after `start_date`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-08-31', 1);
        2016-09-30
@@ -1093,7 +1093,7 @@ case class AddMonths(startDate: Expression, numMonths: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(timestamp1, timestamp2) - Returns number of months between `timestamp1` and `timestamp2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('1997-02-28 10:30:00', '1996-10-30');
        3.94959677
@@ -1136,7 +1136,7 @@ case class MonthsBetween(date1: Expression, date2: Expression, timeZoneId: Optio
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(timestamp, timezone) - Given a timestamp, which corresponds to a certain time of day in the given timezone, returns another timestamp that corresponds to the same time of day in UTC.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-08-31', 'Asia/Seoul');
        2016-08-30 15:00:00
@@ -1197,7 +1197,7 @@ case class ToUTCTimestamp(left: Expression, right: Expression)
       a date. Returns null with invalid input. By default, it follows casting rules to a date if
       the `fmt` is omitted.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-30 04:17:52');
        2009-07-30
@@ -1238,7 +1238,7 @@ case class ParseToDate(left: Expression, format: Option[Expression], child: Expr
       a timestamp. Returns null with invalid input. By default, it follows casting rules to
       a timestamp if the `fmt` is omitted.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2016-12-31 00:12:00');
        2016-12-31 00:12:00
@@ -1273,7 +1273,7 @@ case class ParseToTimestamp(left: Expression, format: Option[Expression], child:
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(date, fmt) - Returns `date` with the time portion of the day truncated to the unit specified by the format model `fmt`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-02-12', 'MM');
        2009-02-01
@@ -1352,7 +1352,7 @@ case class TruncDate(date: Expression, format: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(endDate, startDate) - Returns the number of days from `startDate` to `endDate`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('2009-07-31', '2009-07-30');
        1

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/generators.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/generators.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/generators.scala
index c217aa8..8618f49 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/generators.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/generators.scala
@@ -127,7 +127,7 @@ case class UserDefinedGenerator(
  */
 @ExpressionDescription(
   usage = "_FUNC_(n, expr1, ..., exprk) - Separates `expr1`, ..., `exprk` into `n` rows.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2, 1, 2, 3);
        1  2
@@ -324,7 +324,7 @@ abstract class ExplodeBase extends UnaryExpression with CollectionGenerator with
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Separates the elements of array `expr` into multiple rows, or the elements of map `expr` into multiple rows and columns.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array(10, 20));
        10
@@ -347,7 +347,7 @@ case class Explode(child: Expression) extends ExplodeBase {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Separates the elements of array `expr` into multiple rows with positions, or the elements of map `expr` into multiple rows and columns with positions.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array(10,20));
        0  10
@@ -363,7 +363,7 @@ case class PosExplode(child: Expression) extends ExplodeBase {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Explodes an array of structs into a table.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(array(struct(1, 'a'), struct(2, 'b')));
        1  a

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/hash.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/hash.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/hash.scala
index 2476fc9..1e64d21 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/hash.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/hash.scala
@@ -44,7 +44,7 @@ import org.apache.spark.unsafe.Platform
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns an MD5 128-bit checksum as a hex string of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark');
        8cde774d6f7333752ed72cacddb05126
@@ -78,7 +78,7 @@ case class Md5(child: Expression) extends UnaryExpression with ImplicitCastInput
     _FUNC_(expr, bitLength) - Returns a checksum of SHA-2 family as a hex string of `expr`.
       SHA-224, SHA-256, SHA-384, and SHA-512 are supported. Bit length of 0 is equivalent to 256.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark', 256);
        529bc3b07127ecb7e53a4dcf1991d9152c24537d919178022b2c42657f79a26b
@@ -151,7 +151,7 @@ case class Sha2(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns a sha1 hash value as a hex string of the `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark');
        85f5955f4b27a9a4c2aab6ffe5d7189fc298b92c
@@ -178,7 +178,7 @@ case class Sha1(child: Expression) extends UnaryExpression with ImplicitCastInpu
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns a cyclic redundancy check value of the `expr` as a bigint.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark');
        1557323817
@@ -522,7 +522,7 @@ abstract class InterpretedHashFunction {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2, ...) - Returns a hash value of the arguments.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark', array(123), 2);
        -1321691492

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala
index 6b90354..6f451fe 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala
@@ -112,7 +112,7 @@ private[this] object SharedFactory {
  */
 @ExpressionDescription(
   usage = "_FUNC_(json_txt, path) - Extracts a json object from `path`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('{"a":"b"}', '$.a');
        b
@@ -335,7 +335,7 @@ case class GetJsonObject(json: Expression, path: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(jsonStr, p1, p2, ..., pn) - Returns a tuple like the function get_json_object, but it takes multiple names. All the input parameters and output column types are string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('{"a":1, "b":2}', 'a', 'b');
        1  2
@@ -492,13 +492,14 @@ case class JsonTuple(children: Seq[Expression])
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(jsonStr, schema[, options]) - Returns a struct value with the given `jsonStr` and `schema`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('{"a":1, "b":0.8}', 'a INT, b DOUBLE');
        {"a":1, "b":0.8}
       > SELECT _FUNC_('{"time":"26/08/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));
        {"time":"2015-08-26 00:00:00.0"}
-  """)
+  """,
+  since = "2.2.0")
 // scalastyle:on line.size.limit
 case class JsonToStructs(
     schema: DataType,
@@ -600,7 +601,7 @@ case class JsonToStructs(
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr[, options]) - Returns a json string with a given struct value",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(named_struct('a', 1, 'b', 2));
        {"a":1,"b":2}
@@ -608,7 +609,8 @@ case class JsonToStructs(
        {"time":"26/08/2015"}
       > SELECT _FUNC_(array(named_struct('a', 1, 'b', 2));
        [{"a":1,"b":2}]
-  """)
+  """,
+  since = "2.2.0")
 // scalastyle:on line.size.limit
 case class StructsToJson(
     options: Map[String, String],

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
index 6152562..5c54ffa 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/mathExpressions.scala
@@ -142,7 +142,7 @@ abstract class BinaryMathExpression(f: (Double, Double) => Double, name: String)
  */
 @ExpressionDescription(
   usage = "_FUNC_() - Returns Euler's number, e.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        2.718281828459045
@@ -155,7 +155,7 @@ case class EulerNumber() extends LeafMathExpression(math.E, "E")
  */
 @ExpressionDescription(
   usage = "_FUNC_() - Returns pi.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        3.141592653589793
@@ -171,7 +171,7 @@ case class Pi() extends LeafMathExpression(math.Pi, "PI")
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the inverse cosine (a.k.a. arccosine) of `expr` if -1<=`expr`<=1 or NaN otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        0.0
@@ -184,7 +184,7 @@ case class Acos(child: Expression) extends UnaryMathExpression(math.acos, "ACOS"
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the inverse sine (a.k.a. arcsine) the arc sin of `expr` if -1<=`expr`<=1 or NaN otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -197,7 +197,7 @@ case class Asin(child: Expression) extends UnaryMathExpression(math.asin, "ASIN"
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the inverse tangent (a.k.a. arctangent).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -207,7 +207,7 @@ case class Atan(child: Expression) extends UnaryMathExpression(math.atan, "ATAN"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the cube root of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(27.0);
        3.0
@@ -216,7 +216,7 @@ case class Cbrt(child: Expression) extends UnaryMathExpression(math.cbrt, "CBRT"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the smallest integer not smaller than `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(-0.1);
        0
@@ -253,7 +253,7 @@ case class Ceil(child: Expression) extends UnaryMathExpression(math.ceil, "CEIL"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the cosine of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        1.0
@@ -262,7 +262,7 @@ case class Cos(child: Expression) extends UnaryMathExpression(math.cos, "COS")
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the hyperbolic cosine of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        1.0
@@ -278,7 +278,7 @@ case class Cosh(child: Expression) extends UnaryMathExpression(math.cosh, "COSH"
  */
 @ExpressionDescription(
   usage = "_FUNC_(num, from_base, to_base) - Convert `num` from `from_base` to `to_base`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('100', 2, 10);
        4
@@ -315,7 +315,7 @@ case class Conv(numExpr: Expression, fromBaseExpr: Expression, toBaseExpr: Expre
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns e to the power of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        1.0
@@ -324,7 +324,7 @@ case class Exp(child: Expression) extends UnaryMathExpression(math.exp, "EXP")
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns exp(`expr`) - 1.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -333,7 +333,7 @@ case class Expm1(child: Expression) extends UnaryMathExpression(math.expm1, "EXP
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the largest integer not greater than `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(-0.1);
        -1
@@ -401,7 +401,7 @@ object Factorial {
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the factorial of `expr`. `expr` is [0..20]. Otherwise, null.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(5);
        120
@@ -440,7 +440,7 @@ case class Factorial(child: Expression) extends UnaryExpression with ImplicitCas
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the natural logarithm (base e) of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        0.0
@@ -449,7 +449,7 @@ case class Log(child: Expression) extends UnaryLogExpression(math.log, "LOG")
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the logarithm of `expr` with base 2.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2);
        1.0
@@ -471,7 +471,7 @@ case class Log2(child: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the logarithm of `expr` with base 10.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10);
        1.0
@@ -480,7 +480,7 @@ case class Log10(child: Expression) extends UnaryLogExpression(math.log10, "LOG1
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns log(1 + `expr`).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -492,7 +492,7 @@ case class Log1p(child: Expression) extends UnaryLogExpression(math.log1p, "LOG1
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the double value that is closest in value to the argument and is equal to a mathematical integer.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(12.3456);
        12.0
@@ -504,7 +504,7 @@ case class Rint(child: Expression) extends UnaryMathExpression(math.rint, "ROUND
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns -1.0, 0.0 or 1.0 as `expr` is negative, 0 or positive.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(40);
        1.0
@@ -513,7 +513,7 @@ case class Signum(child: Expression) extends UnaryMathExpression(math.signum, "S
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the sine of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -522,7 +522,7 @@ case class Sin(child: Expression) extends UnaryMathExpression(math.sin, "SIN")
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the hyperbolic sine of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -531,7 +531,7 @@ case class Sinh(child: Expression) extends UnaryMathExpression(math.sinh, "SINH"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the square root of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(4);
        2.0
@@ -540,7 +540,7 @@ case class Sqrt(child: Expression) extends UnaryMathExpression(math.sqrt, "SQRT"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the tangent of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -549,7 +549,7 @@ case class Tan(child: Expression) extends UnaryMathExpression(math.tan, "TAN")
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the cotangent of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        0.6420926159343306
@@ -563,7 +563,7 @@ case class Cot(child: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the hyperbolic tangent of `expr`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0);
        0.0
@@ -572,7 +572,7 @@ case class Tanh(child: Expression) extends UnaryMathExpression(math.tanh, "TANH"
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Converts radians to degrees.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(3.141592653589793);
        180.0
@@ -583,7 +583,7 @@ case class ToDegrees(child: Expression) extends UnaryMathExpression(math.toDegre
 
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Converts degrees to radians.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(180);
        3.141592653589793
@@ -595,7 +595,7 @@ case class ToRadians(child: Expression) extends UnaryMathExpression(math.toRadia
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the string representation of the long value `expr` represented in binary.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(13);
        1101
@@ -698,7 +698,7 @@ object Hex {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Converts `expr` to hexadecimal.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(17);
        11
@@ -735,7 +735,7 @@ case class Hex(child: Expression) extends UnaryExpression with ImplicitCastInput
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Converts hexadecimal `expr` to binary.",
-  extended = """
+  examples = """
     Examples:
       > SELECT decode(_FUNC_('537061726B2053514C'), 'UTF-8');
        Spark SQL
@@ -771,7 +771,7 @@ case class Unhex(child: Expression) extends UnaryExpression with ImplicitCastInp
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns the angle in radians between the positive x-axis of a plane and the point given by the coordinates (`expr1`, `expr2`).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0, 0);
        0.0
@@ -792,7 +792,7 @@ case class Atan2(left: Expression, right: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Raises `expr1` to the power of `expr2`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2, 3);
        8.0
@@ -813,7 +813,7 @@ case class Pow(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(base, expr) - Bitwise left shift.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2, 1);
        4
@@ -847,7 +847,7 @@ case class ShiftLeft(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(base, expr) - Bitwise (signed) right shift.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(4, 1);
        2
@@ -881,7 +881,7 @@ case class ShiftRight(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(base, expr) - Bitwise unsigned right shift.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(4, 1);
        2
@@ -908,7 +908,7 @@ case class ShiftRightUnsigned(left: Expression, right: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns sqrt(`expr1`**2 + `expr2`**2).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(3, 4);
        5.0
@@ -925,7 +925,7 @@ case class Hypot(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(base, expr) - Returns the logarithm of `expr` with `base`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(10, 100);
        2.0
@@ -1155,7 +1155,7 @@ abstract class RoundBase(child: Expression, scale: Expression,
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr, d) - Returns `expr` rounded to `d` decimal places using HALF_UP rounding mode.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2.5, 0);
        3.0
@@ -1175,7 +1175,7 @@ case class Round(child: Expression, scale: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr, d) - Returns `expr` rounded to `d` decimal places using HALF_EVEN rounding mode.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2.5, 0);
        2.0

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
index 3fc4bb7..ef293ff 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/misc.scala
@@ -50,7 +50,7 @@ case class PrintToStderr(child: Expression) extends UnaryExpression {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Throws an exception if `expr` is not true.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(0 < 1);
        NULL
@@ -96,7 +96,7 @@ case class AssertTrue(child: Expression) extends UnaryExpression with ImplicitCa
  */
 @ExpressionDescription(
   usage = "_FUNC_() - Returns the current database.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        default
@@ -111,7 +111,7 @@ case class CurrentDatabase() extends LeafExpression with Unevaluable {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_() - Returns an universally unique identifier (UUID) string. The value is returned as a canonical UUID 36-character string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        46707d92-02f4-4817-8116-a4c3b23e6266
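The behaviour in the examples above is unchanged by this rename; a minimal sketch to reproduce them, assuming a spark-shell session:

```
// assert_true returns NULL when the predicate holds; current_database returns the
// session's current database, "default" unless another one has been selected.
spark.sql("SELECT assert_true(0 < 1)").show()
spark.sql("SELECT current_database()").show()
```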

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/nullExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/nullExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/nullExpressions.scala
index 1b62514..62786e1 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/nullExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/nullExpressions.scala
@@ -37,7 +37,7 @@ import org.apache.spark.sql.types._
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2, ...) - Returns the first non-null argument if exists. Otherwise, null.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(NULL, 1, NULL);
        1
@@ -97,7 +97,7 @@ case class Coalesce(children: Seq[Expression]) extends Expression {
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns `expr2` if `expr1` is null, or `expr1` otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(NULL, array('2'));
        ["2"]
@@ -116,7 +116,7 @@ case class IfNull(left: Expression, right: Expression, child: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns null if `expr1` equals to `expr2`, or `expr1` otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(2, 2);
        NULL
@@ -135,7 +135,7 @@ case class NullIf(left: Expression, right: Expression, child: Expression)
 
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns `expr2` if `expr1` is null, or `expr1` otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(NULL, array('2'));
        ["2"]
@@ -154,7 +154,7 @@ case class Nvl(left: Expression, right: Expression, child: Expression) extends R
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2, expr3) - Returns `expr2` if `expr1` is not null, or `expr3` otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(NULL, 2, 1);
        1
@@ -177,7 +177,7 @@ case class Nvl2(expr1: Expression, expr2: Expression, expr3: Expression, child:
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns true if `expr` is NaN, or false otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(cast('NaN' as double));
        true
@@ -219,7 +219,7 @@ case class IsNaN(child: Expression) extends UnaryExpression
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr1, expr2) - Returns `expr1` if it's not NaN, or `expr2` otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(cast('NaN' as double), 123);
        123.0
@@ -279,7 +279,7 @@ case class NaNvl(left: Expression, right: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns true if `expr` is null, or false otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        false
@@ -305,7 +305,7 @@ case class IsNull(child: Expression) extends UnaryExpression with Predicate {
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns true if `expr` is not null, or false otherwise.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1);
        true
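A minimal sketch, assuming a spark-shell session, exercising a few of the null-handling functions whose docs move to `examples` above:

```
// Expected per the examples above: c = 1, n = NULL, v = 1.
spark.sql("SELECT coalesce(NULL, 1, NULL) AS c, nullif(2, 2) AS n, nvl2(NULL, 2, 1) AS v").show()
```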

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala
index 1d7a3c7..9705176 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/randomExpressions.scala
@@ -60,7 +60,7 @@ abstract class RDG extends UnaryExpression with ExpectsInputTypes with Nondeterm
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_([seed]) - Returns a random value with independent and identically distributed (i.i.d.) uniformly distributed values in [0, 1).",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        0.9629742951434543
@@ -95,7 +95,7 @@ object Rand {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_([seed]) - Returns a random value with independent and identically distributed (i.i.d.) values drawn from the standard normal distribution.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_();
        -0.3254147983080288
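Both functions take an optional seed, so the sample values shown above are only illustrative; a minimal sketch, assuming a spark-shell session:

```
// With an explicit seed the draw is reproducible for a given partition layout;
// the exact values will differ from the samples in the examples above.
spark.sql("SELECT rand(0) AS u, randn(0) AS n").show()
```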

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala
index 5418ace..d0d663f 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressions.scala
@@ -71,34 +71,35 @@ abstract class StringRegexExpression extends BinaryExpression
 @ExpressionDescription(
   usage = "str _FUNC_ pattern - Returns true if str matches pattern, " +
     "null if any arguments are null, false otherwise.",
-  extended = """
+  arguments = """
     Arguments:
-      str - a string expression
-      pattern - a string expression. The pattern is a string which is matched literally, with
-        exception to the following special symbols:
+      * str - a string expression
+      * pattern - a string expression. The pattern is a string which is matched literally, with
+          exception to the following special symbols:
 
           _ matches any one character in the input (similar to . in posix regular expressions)
 
           % matches zero or more characters in the input (similar to .* in posix regular
           expressions)
 
-        The escape character is '\'. If an escape character precedes a special symbol or another
-        escape character, the following character is matched literally. It is invalid to escape
-        any other character.
+          The escape character is '\'. If an escape character precedes a special symbol or another
+          escape character, the following character is matched literally. It is invalid to escape
+          any other character.
 
-        Since Spark 2.0, string literals are unescaped in our SQL parser. For example, in order
-        to match "\abc", the pattern should be "\\abc".
-
-        When SQL config 'spark.sql.parser.escapedStringLiterals' is enabled, it fallbacks
-        to Spark 1.6 behavior regarding string literal parsing. For example, if the config is
-        enabled, the pattern to match "\abc" should be "\abc".
+          Since Spark 2.0, string literals are unescaped in our SQL parser. For example, in order
+          to match "\abc", the pattern should be "\\abc".
 
+          When SQL config 'spark.sql.parser.escapedStringLiterals' is enabled, it fallbacks
+          to Spark 1.6 behavior regarding string literal parsing. For example, if the config is
+          enabled, the pattern to match "\abc" should be "\abc".
+  """,
+  examples = """
     Examples:
       > SELECT '%SystemDrive%\Users\John' _FUNC_ '\%SystemDrive\%\\Users%'
       true
-
-    See also:
-      Use RLIKE to match with standard regular expressions.
+  """,
+  note = """
+    Use RLIKE to match with standard regular expressions.
   """)
 case class Like(left: Expression, right: Expression) extends StringRegexExpression {
 
@@ -152,18 +153,20 @@ case class Like(left: Expression, right: Expression) extends StringRegexExpressi
 
 @ExpressionDescription(
   usage = "str _FUNC_ regexp - Returns true if `str` matches `regexp`, or false otherwise.",
-  extended = """
+  arguments = """
     Arguments:
-      str - a string expression
-      regexp - a string expression. The pattern string should be a Java regular expression.
-
-        Since Spark 2.0, string literals (including regex patterns) are unescaped in our SQL parser.
-        For example, to match "\abc", a regular expression for `regexp` can be "^\\abc$".
-
-        There is a SQL config 'spark.sql.parser.escapedStringLiterals' that can be used to fallback
-        to the Spark 1.6 behavior regarding string literal parsing. For example, if the config is
-        enabled, the `regexp` that can match "\abc" is "^\abc$".
-
+      * str - a string expression
+      * regexp - a string expression. The pattern string should be a Java regular expression.
+
+          Since Spark 2.0, string literals (including regex patterns) are unescaped in our SQL
+          parser. For example, to match "\abc", a regular expression for `regexp` can be
+          "^\\abc$".
+
+          There is a SQL config 'spark.sql.parser.escapedStringLiterals' that can be used to
+          fallback to the Spark 1.6 behavior regarding string literal parsing. For example,
+          if the config is enabled, the `regexp` that can match "\abc" is "^\abc$".
+  """,
+  examples = """
     Examples:
       When spark.sql.parser.escapedStringLiterals is disabled (default).
       > SELECT '%SystemDrive%\Users\John' _FUNC_ '%SystemDrive%\\Users.*'
@@ -172,9 +175,9 @@ case class Like(left: Expression, right: Expression) extends StringRegexExpressi
       When spark.sql.parser.escapedStringLiterals is enabled.
       > SELECT '%SystemDrive%\Users\John' _FUNC_ '%SystemDrive%\Users.*'
       true
-
-    See also:
-      Use LIKE to match with simple string pattern.
+  """,
+  note = """
+    Use LIKE to match with simple string pattern.
   """)
 case class RLike(left: Expression, right: Expression) extends StringRegexExpression {
 
@@ -229,7 +232,7 @@ case class RLike(left: Expression, right: Expression) extends StringRegexExpress
  */
 @ExpressionDescription(
   usage = "_FUNC_(str, regex) - Splits `str` around occurrences that match `regex`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('oneAtwoBthreeC', '[ABC]');
        ["one","two","three",""]
@@ -266,7 +269,7 @@ case class StringSplit(str: Expression, pattern: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, regexp, rep) - Replaces all substrings of `str` that match `regexp` with `rep`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('100-200', '(\d+)', 'num');
        num-num
@@ -373,7 +376,7 @@ case class RegExpReplace(subject: Expression, regexp: Expression, rep: Expressio
  */
 @ExpressionDescription(
   usage = "_FUNC_(str, regexp[, idx]) - Extracts a group that matches `regexp`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('100-200', '(\d+)-(\d+)', 1);
        100
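A minimal sketch, assuming a spark-shell session, mirroring the three regexp examples above (again triple-quoted so `\d` reaches the SQL parser unchanged):

```
// Expected per the examples above: parts = ["one","two","three",""],
// repl = num-num, grp = 100.
spark.sql("""SELECT split('oneAtwoBthreeC', '[ABC]') AS parts,
                    regexp_replace('100-200', '(\d+)', 'num') AS repl,
                    regexp_extract('100-200', '(\d+)-(\d+)', 1) AS grp""").show(truncate = false)
```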

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala
index d75b9d6..7ab45a6 100755
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/stringExpressions.scala
@@ -42,7 +42,7 @@ import org.apache.spark.unsafe.types.{ByteArray, UTF8String}
  */
 @ExpressionDescription(
   usage = "_FUNC_(str1, str2, ..., strN) - Returns the concatenation of str1, str2, ..., strN.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark', 'SQL');
        SparkSQL
@@ -85,7 +85,7 @@ case class Concat(children: Seq[Expression]) extends Expression with ImplicitCas
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(sep, [str | array(str)]+) - Returns the concatenation of the strings separated by `sep`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(' ', 'Spark', 'SQL');
         Spark SQL
@@ -179,7 +179,7 @@ case class ConcatWs(children: Seq[Expression])
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(n, str1, str2, ...) - Returns the `n`-th string, e.g., returns `str2` when `n` is 2.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(1, 'scala', 'java');
        scala
@@ -262,7 +262,7 @@ trait String2StringExpression extends ImplicitCastInputTypes {
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Returns `str` with all characters changed to uppercase.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('SparkSql');
        SPARKSQL
@@ -282,7 +282,7 @@ case class Upper(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Returns `str` with all characters changed to lowercase.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('SparkSql');
        sparksql
@@ -346,13 +346,14 @@ case class EndsWith(left: Expression, right: Expression) extends StringPredicate
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, search[, replace]) - Replaces all occurrences of `search` with `replace`.",
-  extended = """
+  arguments = """
     Arguments:
-      str - a string expression
-      search - a string expression. If `search` is not found in `str`, `str` is returned unchanged.
-      replace - a string expression. If `replace` is not specified or is an empty string, nothing replaces
-                the string that is removed from `str`.
-
+      * str - a string expression
+      * search - a string expression. If `search` is not found in `str`, `str` is returned unchanged.
+      * replace - a string expression. If `replace` is not specified or is an empty string, nothing replaces
+          the string that is removed from `str`.
+  """,
+  examples = """
     Examples:
       > SELECT _FUNC_('ABCabc', 'abc', 'DEF');
        ABCDEF
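The arguments section above notes that the third argument may be omitted; a minimal sketch, assuming a spark-shell session (the two-argument result is inferred from that description, not from a documented example):

```
// three_args = ABCDEF, as in the example above; with the third argument omitted,
// nothing replaces the removed text, so two_args should be ABC.
spark.sql("SELECT replace('ABCabc', 'abc', 'DEF') AS three_args, " +
          "replace('ABCabc', 'abc') AS two_args").show()
```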
@@ -410,7 +411,7 @@ object StringTranslate {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(input, from, to) - Translates the `input` string by replacing the characters present in the `from` string with the corresponding characters in the `to` string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('AaBbCc', 'abc', '123');
        A1B2C3
@@ -477,7 +478,7 @@ case class StringTranslate(srcExpr: Expression, matchingExpr: Expression, replac
     _FUNC_(str, str_array) - Returns the index (1-based) of the given string (`str`) in the comma-delimited list (`str_array`).
       Returns 0, if the string was not found or if the given string (`str`) contains a comma.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('ab','abc,b,ab,c,def');
        3
@@ -507,7 +508,7 @@ case class FindInSet(left: Expression, right: Expression) extends BinaryExpressi
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Removes the leading and trailing space characters from `str`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('    SparkSQL   ');
        SparkSQL
@@ -529,7 +530,7 @@ case class StringTrim(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Removes the leading and trailing space characters from `str`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('    SparkSQL');
        SparkSQL
@@ -551,7 +552,7 @@ case class StringTrimLeft(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Removes the trailing space characters from `str`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('    SparkSQL   ');
            SparkSQL
@@ -577,7 +578,7 @@ case class StringTrimRight(child: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str, substr) - Returns the (1-based) index of the first occurrence of `substr` in `str`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('SparkSQL', 'SQL');
        6
@@ -617,7 +618,7 @@ case class StringInstr(str: Expression, substr: Expression)
       (counting from the right) is returned. The function substring_index performs a case-sensitive match
       when searching for `delim`.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('www.apache.org', '.', 2);
        www.apache
@@ -652,7 +653,7 @@ case class SubstringIndex(strExpr: Expression, delimExpr: Expression, countExpr:
     _FUNC_(substr, str[, pos]) - Returns the position of the first occurrence of `substr` in `str` after position `pos`.
       The given `pos` and return value are 1-based.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('bar', 'foobarbar');
        4
@@ -739,7 +740,7 @@ case class StringLocate(substr: Expression, str: Expression, start: Expression)
     _FUNC_(str, len, pad) - Returns `str`, left-padded with `pad` to a length of `len`.
       If `str` is longer than `len`, the return value is shortened to `len` characters.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('hi', 5, '??');
        ???hi
@@ -772,7 +773,7 @@ case class StringLPad(str: Expression, len: Expression, pad: Expression)
     _FUNC_(str, len, pad) - Returns `str`, right-padded with `pad` to a length of `len`.
       If `str` is longer than `len`, the return value is shortened to `len` characters.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('hi', 5, '??');
        hi???
@@ -815,7 +816,7 @@ object ParseUrl {
  */
 @ExpressionDescription(
   usage = "_FUNC_(url, partToExtract[, key]) - Extracts a part from a URL.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('http://spark.apache.org/path?query=1', 'HOST')
        spark.apache.org
@@ -970,7 +971,7 @@ case class ParseUrl(children: Seq[Expression])
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(strfmt, obj, ...) - Returns a formatted string from printf-style format strings.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_("Hello World %d %s", 100, "days");
        Hello World 100 days
@@ -1049,7 +1050,7 @@ case class FormatString(children: Expression*) extends Expression with ImplicitC
     _FUNC_(str) - Returns `str` with the first letter of each word in uppercase.
       All other letters are in lowercase. Words are delimited by white space.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('sPark sql');
        Spark Sql
@@ -1072,7 +1073,7 @@ case class InitCap(child: Expression) extends UnaryExpression with ImplicitCastI
  */
 @ExpressionDescription(
   usage = "_FUNC_(str, n) - Returns the string which repeats the given string value n times.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('123', 2);
        123123
@@ -1101,7 +1102,7 @@ case class StringRepeat(str: Expression, times: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Returns the reversed given string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL');
        LQS krapS
@@ -1121,7 +1122,7 @@ case class StringReverse(child: Expression) extends UnaryExpression with String2
  */
 @ExpressionDescription(
   usage = "_FUNC_(n) - Returns a string consisting of `n` spaces.",
-  extended = """
+  examples = """
     Examples:
       > SELECT concat(_FUNC_(2), '1');
          1
@@ -1154,7 +1155,7 @@ case class StringSpace(child: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, pos[, len]) - Returns the substring of `str` that starts at `pos` and is of length `len`, or the slice of byte array that starts at `pos` and is of length `len`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL', 5);
        k SQL
@@ -1204,7 +1205,7 @@ case class Substring(str: Expression, pos: Expression, len: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, len) - Returns the rightmost `len`(`len` can be string type) characters from the string `str`,if `len` is less or equal than 0 the result is an empty string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL', 3);
        SQL
@@ -1226,7 +1227,7 @@ case class Right(str: Expression, len: Expression, child: Expression) extends Ru
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, len) - Returns the leftmost `len`(`len` can be string type) characters from the string `str`,if `len` is less or equal than 0 the result is an empty string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL', 3);
        Spa
@@ -1248,7 +1249,7 @@ case class Left(str: Expression, len: Expression, child: Expression) extends Run
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the character length of `expr` or number of bytes in binary data.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL');
        9
@@ -1280,7 +1281,7 @@ case class Length(child: Expression) extends UnaryExpression with ImplicitCastIn
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the bit length of `expr` or number of bits in binary data.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL');
        72
@@ -1307,7 +1308,7 @@ case class BitLength(child: Expression) extends UnaryExpression with ImplicitCas
  */
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the byte length of `expr` or number of bytes in binary data.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL');
        9
@@ -1334,7 +1335,7 @@ case class OctetLength(child: Expression) extends UnaryExpression with ImplicitC
  */
 @ExpressionDescription(
   usage = "_FUNC_(str1, str2) - Returns the Levenshtein distance between the two given strings.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('kitten', 'sitting');
        3
@@ -1359,7 +1360,7 @@ case class Levenshtein(left: Expression, right: Expression) extends BinaryExpres
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Returns Soundex code of the string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Miller');
        M460
@@ -1382,7 +1383,7 @@ case class SoundEx(child: Expression) extends UnaryExpression with ExpectsInputT
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Returns the numeric value of the first character of `str`.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('222');
        50
@@ -1424,7 +1425,7 @@ case class Ascii(child: Expression) extends UnaryExpression with ImplicitCastInp
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(expr) - Returns the ASCII character having the binary equivalent to `expr`. If n is larger than 256 the result is equivalent to chr(n % 256)",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(65);
        A
@@ -1467,7 +1468,7 @@ case class Chr(child: Expression) extends UnaryExpression with ImplicitCastInput
  */
 @ExpressionDescription(
   usage = "_FUNC_(bin) - Converts the argument from a binary `bin` to a base 64 string.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Spark SQL');
        U3BhcmsgU1FM
@@ -1496,7 +1497,7 @@ case class Base64(child: Expression) extends UnaryExpression with ImplicitCastIn
  */
 @ExpressionDescription(
   usage = "_FUNC_(str) - Converts the argument from a base 64 string `str` to a binary.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('U3BhcmsgU1FM');
        Spark SQL
@@ -1525,7 +1526,7 @@ case class UnBase64(child: Expression) extends UnaryExpression with ImplicitCast
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(bin, charset) - Decodes the first argument using the second argument character set.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(encode('abc', 'utf-8'), 'utf-8');
        abc
@@ -1564,7 +1565,7 @@ case class Decode(bin: Expression, charset: Expression)
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(str, charset) - Encodes the first argument using the second argument character set.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('abc', 'utf-8');
        abc
@@ -1605,7 +1606,7 @@ case class Encode(value: Expression, charset: Expression)
       decimal places. If `expr2` is 0, the result has no decimal point or fractional part.
       This is supposed to function like MySQL's FORMAT.
   """,
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_(12332.123456, 4);
        12,332.1235
@@ -1738,7 +1739,7 @@ case class FormatNumber(x: Expression, d: Expression)
  */
 @ExpressionDescription(
   usage = "_FUNC_(str[, lang, country]) - Splits `str` into an array of array of words.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('Hi there! Good morning.');
        [["Hi","there"],["Good","morning"]]

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/xml/xpath.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/xml/xpath.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/xml/xpath.scala
index aa32804..d018556 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/xml/xpath.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/xml/xpath.scala
@@ -58,7 +58,7 @@ abstract class XPathExtract extends BinaryExpression with ExpectsInputTypes with
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns true if the XPath expression evaluates to true, or if a matching node is found.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b></a>','a/b');
        true
@@ -77,7 +77,7 @@ case class XPathBoolean(xml: Expression, path: Expression) extends XPathExtract
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns a short integer value, or the value zero if no match is found, or a match is found but the value is non-numeric.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b><b>2</b></a>', 'sum(a/b)');
        3
@@ -96,7 +96,7 @@ case class XPathShort(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns an integer value, or the value zero if no match is found, or a match is found but the value is non-numeric.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b><b>2</b></a>', 'sum(a/b)');
        3
@@ -115,7 +115,7 @@ case class XPathInt(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns a long integer value, or the value zero if no match is found, or a match is found but the value is non-numeric.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b><b>2</b></a>', 'sum(a/b)');
        3
@@ -134,7 +134,7 @@ case class XPathLong(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns a float value, the value zero if no match is found, or NaN if a match is found but the value is non-numeric.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b><b>2</b></a>', 'sum(a/b)');
        3.0
@@ -153,7 +153,7 @@ case class XPathFloat(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns a double value, the value zero if no match is found, or NaN if a match is found but the value is non-numeric.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>1</b><b>2</b></a>', 'sum(a/b)');
        3.0
@@ -172,7 +172,7 @@ case class XPathDouble(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns the text contents of the first xml node that matches the XPath expression.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>b</b><c>cc</c></a>','a/c');
        cc
@@ -191,7 +191,7 @@ case class XPathString(xml: Expression, path: Expression) extends XPathExtract {
 // scalastyle:off line.size.limit
 @ExpressionDescription(
   usage = "_FUNC_(xml, xpath) - Returns a string array of values within the nodes of xml that match the XPath expression.",
-  extended = """
+  examples = """
     Examples:
       > SELECT _FUNC_('<a><b>b1</b><b>b2</b><b>b3</b><c>c1</c><c>c2</c></a>','a/b/text()');
        ['b1','b2','b3']
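A minimal sketch, assuming a spark-shell session, touching a few of the XPath helpers documented above:

```
// Expected per the examples above: found = true, total = 3, txt = cc.
spark.sql("SELECT xpath_boolean('<a><b>1</b></a>', 'a/b') AS found, " +
          "xpath_int('<a><b>1</b><b>2</b></a>', 'sum(a/b)') AS total, " +
          "xpath_string('<a><b>b</b><c>cc</c></a>', 'a/c') AS txt").show()
```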

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/core/src/test/resources/sql-tests/results/cast.sql.out
----------------------------------------------------------------------
diff --git a/sql/core/src/test/resources/sql-tests/results/cast.sql.out b/sql/core/src/test/resources/sql-tests/results/cast.sql.out
index 4e6353b..9c5f455 100644
--- a/sql/core/src/test/resources/sql-tests/results/cast.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/cast.sql.out
@@ -194,6 +194,8 @@ DESC FUNCTION EXTENDED boolean
 struct<function_desc:string>
 -- !query 23 output
 Class: org.apache.spark.sql.catalyst.expressions.Cast
-Extended Usage:N/A.
+Extended Usage:
+    No example/argument for boolean.
+
 Function: boolean
 Usage: boolean(expr) - Casts the value `expr` to the target data type `boolean`.
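A minimal sketch, assuming a spark-shell session, to see the new placeholder for expressions that document neither arguments nor examples:

```
// Prints the usage plus "No example/argument for boolean." instead of the old "N/A.".
spark.sql("DESC FUNCTION EXTENDED boolean").show(truncate = false)
```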

http://git-wip-us.apache.org/repos/asf/spark/blob/ba327ee5/sql/core/src/test/resources/sql-tests/results/json-functions.sql.out
----------------------------------------------------------------------
diff --git a/sql/core/src/test/resources/sql-tests/results/json-functions.sql.out b/sql/core/src/test/resources/sql-tests/results/json-functions.sql.out
index fedabae..22da20d 100644
--- a/sql/core/src/test/resources/sql-tests/results/json-functions.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/json-functions.sql.out
@@ -27,6 +27,8 @@ Extended Usage:
       > SELECT to_json(array(named_struct('a', 1, 'b', 2));
        [{"a":1,"b":2}]
   
+    Since: 2.2.0
+
 Function: to_json
 Usage: to_json(expr[, options]) - Returns a json string with a given struct value
 
@@ -105,6 +107,8 @@ Extended Usage:
       > SELECT from_json('{"time":"26/08/2015"}', 'time Timestamp', map('timestampFormat', 'dd/MM/yyyy'));
        {"time":"2015-08-26 00:00:00.0"}
   
+    Since: 2.2.0
+
 Function: from_json
 Usage: from_json(jsonStr, schema[, options]) - Returns a struct value with the given `jsonStr` and `schema`.
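A minimal sketch, assuming a spark-shell session, to see the new `Since:` line in the extended output (the result is a single `function_desc` string column, as in the golden file above):

```
// Prints the usage, examples and the "Since: 2.2.0" line for from_json.
spark.sql("DESCRIBE FUNCTION EXTENDED from_json").collect().foreach(row => println(row.getString(0)))
```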
 

