Posted to reviews@spark.apache.org by viirya <gi...@git.apache.org> on 2015/11/09 10:52:50 UTC

[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

GitHub user viirya opened a pull request:

    https://github.com/apache/spark/pull/9565

    [SPARK-11593][SQL] Replace catalyst converter with RowEncoder in ScalaUDF

    JIRA: https://issues.apache.org/jira/browse/SPARK-11593
    
    We currently use Catalyst converters to convert between Catalyst types and Scala types. We should replace them with `RowEncoder`.
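
As a self-contained sketch of the pattern this change removes (this is not Spark's actual `ScalaUDF` code; `Converter` and `makeEvaluator` are made-up names for illustration): the old implementation hand-writes one `case n =>` block per function arity, each declaring its own `converterN` values, while the per-argument conversion can be expressed once, generically.

```scala
object ConverterSketch {
  type Converter = Any => Any

  // Build the per-argument converters once and apply them generically,
  // instead of one hand-written `case n =>` block per arity.
  def makeEvaluator(converters: Seq[Converter],
                    func: Seq[Any] => Any): Seq[Any] => Any =
    inputs => func(inputs.zip(converters).map { case (in, c) => c(in) })

  def main(args: Array[String]): Unit = {
    val upper: Converter  = v => v.toString.toUpperCase
    val double: Converter = v => v.asInstanceOf[Int] * 2
    val eval = makeEvaluator(Seq(upper, double), xs => s"${xs(0)}:${xs(1)}")
    println(eval(Seq("spark", 21)))  // prints "SPARK:42"
  }
}
```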


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/viirya/spark-1 rowencoder-scalaudf

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/9565.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #9565
    
----
commit 942dad7707aa250de55dfe4d873400cb0418dcdd
Author: Liang-Chi Hsieh <vi...@appier.com>
Date:   2015-11-09T09:48:23Z

    Replace catalyst converter with RowEncoder.

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206886590
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55219/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155254835
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45449/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155245730
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155151173
  
    Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162441586
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47258/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162801790
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155023721
  
    **[Test build #45358 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45358/consoleFull)** for PR 9565 at commit [`942dad7`](https://github.com/apache/spark/commit/942dad7707aa250de55dfe4d873400cb0418dcdd).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44509321
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/RowEncoder.scala ---
    @@ -109,11 +117,16 @@ object RowEncoder {
     
         case StructType(fields) =>
           val convertedFields = fields.zipWithIndex.map { case (f, i) =>
    +        val method = if (f.dataType.isInstanceOf[StructType]) {
    +          "getStruct"
    +        } else {
    +          "get"
    +        }
             If(
               Invoke(inputObject, "isNullAt", BooleanType, Literal(i) :: Nil),
               Literal.create(null, f.dataType),
               extractorsFor(
    -            Invoke(inputObject, "get", externalDataTypeFor(f.dataType), Literal(i) :: Nil),
    +            Invoke(inputObject, method, externalDataTypeFor(f.dataType), Literal(i) :: Nil),
    --- End diff --
    
    If a field is `StructType`, we explicitly call `getStruct` to take care of both `Product` and `Row`.
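
A minimal, made-up sketch (this is not Spark's `Row` API; `SimpleRow` and `Point` are invented for illustration) of why a struct-typed field needs a dedicated accessor: a generic `get` returns whatever external object backs the field (a case class, i.e. a `Product`, or a nested row), while `getStruct` normalizes both to a row-like value.

```scala
case class SimpleRow(values: Seq[Any]) {
  // Generic accessor: returns the backing object as-is.
  def get(i: Int): Any = values(i)

  // Struct accessor: normalizes both a Product (case class) and a
  // nested SimpleRow to a SimpleRow. Note SimpleRow is itself a
  // Product, so its case must come first.
  def getStruct(i: Int): SimpleRow = values(i) match {
    case r: SimpleRow => r
    case p: Product   => SimpleRow(p.productIterator.toSeq)
    case other        => sys.error(s"not a struct: $other")
  }
}

case class Point(x: Int, y: Int)

object StructSketch {
  def main(args: Array[String]): Unit = {
    val byProduct = SimpleRow(Seq(Point(1, 2)))
    val byRow     = SimpleRow(Seq(SimpleRow(Seq(1, 2))))
    println(byProduct.get(0))        // a Point, not a row
    println(byProduct.getStruct(0))  // a SimpleRow
    println(byRow.getStruct(0))      // also a SimpleRow
  }
}
```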




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207294679
  
    **[Test build #55325 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55325/consoleFull)** for PR 9565 at commit [`2a0c319`](https://github.com/apache/spark/commit/2a0c3192e1e7872a800913fe42e4b02ed622b37c).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r46784210
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -66,980 +85,305 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
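
The refactor above can be summarized in a small standalone sketch: the removed code builds one `createToScalaConverter` per argument and converts each child's eval result separately, while the added code decodes the whole `InternalRow` once and reads arguments positionally. The sketch below uses only hypothetical stand-in types (`InternalRow` as `Array[Any]`, a plain function in place of Spark's real `RowEncoder`), not Spark's actual API:

```scala
// Sketch of the two binding styles contrasted in this diff (toy types only).
object UdfBindingSketch {
  type InternalRow = Array[Any]   // stand-in for Spark's InternalRow
  type Converter   = Any => Any   // stand-in for createToScalaConverter's result

  // Old style: one converter per child; N conversions applied independently.
  def bindOld(func: (Any, Any) => Any,
              converters: Seq[Converter]): InternalRow => Any =
    (row: InternalRow) =>
      func(converters(0)(row(0)), converters(1)(row(1)))

  // New style: decode the entire input row once (RowEncoder's role in the PR),
  // then pass the decoded fields to the function positionally.
  def bindNew(func: (Any, Any) => Any,
              decodeRow: InternalRow => Seq[Any]): InternalRow => Any =
    (row: InternalRow) => {
      val decoded = decodeRow(row)
      func(decoded(0), decoded(1))
    }
}
```

With an identity decoder both bindings are equivalent, which is why the diff can delete the per-arity converter blocks wholesale:

```scala
val f    = (a: Any, b: Any) => a.toString + b.toString
val old  = UdfBindingSketch.bindOld(f, Seq(identity, identity))
val neu  = UdfBindingSketch.bindNew(f, r => r.toSeq)
// old(Array[Any](1, 2)) and neu(Array[Any](1, 2)) both yield "12"
```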
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
    -  // Generate codes used to convert the arguments to Scala type for user-defined funtions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    -    val converterClassName = classOf[Any => Any].getName
    -    val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
    -    val expressionClassName = classOf[Expression].getName
    -    val scalaUDFClassName = classOf[ScalaUDF].getName
    -
    -    val converterTerm = ctx.freshName("converter")
    -    val expressionIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, converterTerm,
    -      s"this.$converterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToScalaConverter(((${expressionClassName})((($scalaUDFClassName)" +
    -          s"expressions[$expressionIdx]).getChildren().apply($index))).dataType());")
    -    converterTerm
    -  }
    -
       override def genCode(
           ctx: CodeGenContext,
           ev: GeneratedExpressionCode): String = {
     
         ctx.references += this
    +    val scalaUDFTermIdx = ctx.references.size - 1
     
         val scalaUDFClassName = classOf[ScalaUDF].getName
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
         val expressionClassName = classOf[Expression].getName
    -
    -    // Generate codes used to convert the returned value of user-defined functions to Catalyst type
    -    val catalystConverterTerm = ctx.freshName("catalystConverter")
    -    val catalystConverterTermIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, catalystConverterTerm,
    -      s"this.$catalystConverterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToCatalystConverter((($scalaUDFClassName)expressions" +
    -          s"[$catalystConverterTermIdx]).dataType());")
    +    val expressionEncoderClassName = classOf[ExpressionEncoder[Row]].getName
    +    val rowEncoderClassName = RowEncoder.getClass.getName + ".MODULE$"
    +    val structTypeClassName = StructType.getClass.getName + ".MODULE$"
    +    val rowClassName = Row.getClass.getName + ".MODULE$"
    +    val rowClass = classOf[Row].getName
    +    val internalRowClassName = classOf[InternalRow].getName
    +    // scalastyle:off
    +    val javaConversionClassName = scala.collection.JavaConversions.getClass.getName + ".MODULE$"
    +    // scalastyle:on
    +
    +    // Generate code for input encoder
    +    val inputExpressionEncoderTerm = ctx.freshName("inputExpressionEncoder")
    +    ctx.addMutableState(expressionEncoderClassName, inputExpressionEncoderTerm,
    +      s"this.$inputExpressionEncoderTerm = ($expressionEncoderClassName)$rowEncoderClassName" +
    +        s".apply((($scalaUDFClassName)expressions" +
    +          s"[$scalaUDFTermIdx]).getInputSchema());")
    +
    +    // Generate code for output encoder
    +    val outputExpressionEncoderTerm = ctx.freshName("outputExpressionEncoder")
    +    ctx.addMutableState(expressionEncoderClassName, outputExpressionEncoderTerm,
    +      s"this.$outputExpressionEncoderTerm = ($expressionEncoderClassName)$rowEncoderClassName" +
    +        s".apply((($scalaUDFClassName)expressions[$scalaUDFTermIdx]).getDataType());")
     
         val resultTerm = ctx.freshName("result")
     
    -    // This must be called before children expressions' codegen
    -    // because ctx.references is used in genCodeForConverter
    -    val converterTerms = (0 until children.size).map(genCodeForConverter(ctx, _))
    -
         // Initialize user-defined function
         val funcClassName = s"scala.Function${children.size}"
     
         val funcTerm = ctx.freshName("udf")
    -    val funcExpressionIdx = ctx.references.size - 1
         ctx.addMutableState(funcClassName, funcTerm,
           s"this.$funcTerm = ($funcClassName)((($scalaUDFClassName)expressions" +
    -        s"[$funcExpressionIdx]).userDefinedFunc());")
    +        s"[$scalaUDFTermIdx]).userDefinedFunc());")
     
         // codegen for children expressions
         val evals = children.map(_.gen(ctx))
    +    val evalsArgs = evals.map(_.value).mkString(", ")
    +    val evalsAsSeq = s"$javaConversionClassName.asScalaIterable" +
    +      s"(java.util.Arrays.asList($evalsArgs)).toList()"
    +    val inputInternalRowTerm = ctx.freshName("inputRow")
    +    val inputInternalRow = s"$rowClass $inputInternalRowTerm = " +
    +      s"($rowClass)$inputExpressionEncoderTerm.fromRow(InternalRow.fromSeq($evalsAsSeq));"
    --- End diff --
    
    The reason to construct a row here is to convert the expression results into Scala objects using the row encoder. Because the UDF accepts and produces Scala objects, we need to decode its input from the Catalyst representation and encode its output back.
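
    To make the decode-then-apply flow concrete, here is a small self-contained sketch of the arity-dispatch pattern in the diff above. It is not Spark code: a plain IndexedSeq stands in for the row decoded by RowEncoder, and the names (UdfDispatchSketch, makeEval, DecodedRow) are illustrative only.

    ```scala
    // Standalone sketch (not Spark code): mimics the `children.size match`
    // dispatch in the diff. The real implementation decodes an InternalRow
    // into a Row via RowEncoder; here a plain IndexedSeq stands in for the
    // decoded row. Only arities 0-2 are shown; the real code goes up to 22,
    // the maximum arity of Scala's FunctionN traits.
    object UdfDispatchSketch {
      type DecodedRow = IndexedSeq[Any]

      // Cast the untyped function to the FunctionN matching `arity`, then
      // build a closure that applies it to the decoded arguments.
      def makeEval(function: AnyRef, arity: Int): DecodedRow => Any = arity match {
        case 0 =>
          val f = function.asInstanceOf[() => Any]
          _ => f()
        case 1 =>
          val f = function.asInstanceOf[Any => Any]
          row => f(row(0))
        case 2 =>
          val f = function.asInstanceOf[(Any, Any) => Any]
          row => f(row(0), row(1))
        case n =>
          sys.error(s"arity $n not sketched here")
      }
    }
    ```

    A two-argument UDF would then be evaluated as makeEval(add, 2)(IndexedSeq(3, 4)); the Spark version additionally encodes the result back to the Catalyst representation before returning it to the query engine.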


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207198914
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55266/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155151680
  
    **[Test build #45390 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45390/consoleFull)** for PR 9565 at commit [`75ffaeb`](https://github.com/apache/spark/commit/75ffaebc56c2bcc692c8924cd31c4b18a61f02c7).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206813074
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155288232
  
    cc @davies 




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206813080
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55218/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155388569
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155709206
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155230012
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162851650
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47330/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155857808
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206886587
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206788688
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44256945
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -37,21 +39,36 @@ case class ScalaUDF(
     
       override def toString: String = s"UDF(${children.mkString(",")})"
     
    +  // Accessors used in genCode
    --- End diff --
    
    Moved here because they should not be included in the scalastyle:off section.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204431261
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54703/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206788433
  
    **[Test build #55210 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55210/consoleFull)** for PR 9565 at commit [`8dbc551`](https://github.com/apache/spark/commit/8dbc551f72c2a940f9fed54c0cfce5a208c3fc60).
     * This patch **fails MiMa tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155166055
  
    **[Test build #45390 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45390/consoleFull)** for PR 9565 at commit [`75ffaeb`](https://github.com/apache/spark/commit/75ffaebc56c2bcc692c8924cd31c4b18a61f02c7).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207198913
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206980072
  
    **[Test build #55226 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55226/consoleFull)** for PR 9565 at commit [`405e8b0`](https://github.com/apache/spark/commit/405e8b048f475bef2893f3be76b43516ab829954).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162756516
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155403444
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155403446
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45517/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162552952
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47269/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156840483
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156494631
  
    @davies OK. I will separate the RowEncoder changes into another PR.



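The diff quoted in the next review comment deletes a long run of near-identical per-arity branches from ScalaUDF, each of the form `func(converter0(child0.eval(input)), converter1(child1.eval(input)), ...)`. As a hedged illustration only — this is plain Scala with made-up names (`UdfDispatchSketch`, `applyUdf`), not Spark's actual RowEncoder-based replacement — the repetition can be collapsed by converting the arguments generically before dispatching on arity:

```scala
// Sketch: the removed ScalaUDF code repeated converter0/converter1/... vals
// and a ~40-line branch for every arity up to 22. Converting each argument
// through a Seq of converters first shrinks every branch to one line.
object UdfDispatchSketch {
  type Converter = Any => Any

  def applyUdf(function: AnyRef, args: Seq[Any], converters: Seq[Converter]): Any = {
    // Generic per-argument conversion replaces the hand-written converterN vals.
    val converted = args.zip(converters).map { case (arg, conv) => conv(arg) }
    converted.size match {
      case 0 => function.asInstanceOf[() => Any]()
      case 1 => function.asInstanceOf[Any => Any](converted(0))
      case 2 => function.asInstanceOf[(Any, Any) => Any](converted(0), converted(1))
      // ... one short line per arity up to 22, instead of a long block each
      case n => sys.error(s"UDFs with $n arguments are not supported")
    }
  }
}
```

For example, `UdfDispatchSketch.applyUdf((a: Any, b: Any) => a.toString + b.toString, Seq(1, 2), Seq((x: Any) => x, (x: Any) => x))` yields `"12"`. The actual PR goes further and swaps the converters themselves for a RowEncoder.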

[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44375480
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -60,924 +77,238 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
       // Generate code used to convert the arguments to Scala types for user-defined functions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    +  private[this] def genCodeForConverter(
    +      ctx: CodeGenContext,
    +      scalaUDFTermIdx: Int,
    +      index: Int): String = {
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
    --- End diff --
    
    When I finish the work on the generated ScalaUDF, this can be removed.
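
The 22-arm dispatch quoted above follows one mechanical pattern: cast the erased function to the right `FunctionN`, decode the input row, and apply it to the decoded fields. A minimal stand-alone model of that pattern (plain Scala, no Spark dependencies; `callUdf` and the `Seq[Any]` stand-in for the decoded `Row` are hypothetical names, not part of the patch):

```scala
// Simplified sketch of the arity dispatch in the diff above. Instead of
// inputEncoder.fromRow(input), the "decoded row" here is just a Seq[Any];
// only arities 0-2 are shown, the real code repeats this up to 22.
object ArityDispatchSketch {
  def callUdf(function: AnyRef, convertedRow: Seq[Any]): Any =
    convertedRow.size match {
      case 0 => function.asInstanceOf[() => Any]()
      case 1 => function.asInstanceOf[Any => Any](convertedRow(0))
      case 2 => function.asInstanceOf[(Any, Any) => Any](convertedRow(0), convertedRow(1))
      case n => sys.error(s"sketch only covers arity <= 2, got $n")
    }

  def main(args: Array[String]): Unit = {
    // An untyped "UDF" of arity 2, as ScalaUDF sees it after erasure.
    val add = (a: Any, b: Any) => a.asInstanceOf[Int] + b.asInstanceOf[Int]
    assert(callUdf(add, Seq(1, 2)) == 3)
  }
}
```

The repetition exists because `Function0` through `Function22` are distinct types after erasure, so each arity needs its own cast and apply.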


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155246523
  
    **[Test build #45449 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45449/consoleFull)** for PR 9565 at commit [`1e13ff9`](https://github.com/apache/spark/commit/1e13ff98d1458b78c5cb8cb187bad4d91dc1143e).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206788478
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206711287
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155288227
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45475/
    Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162801792
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47315/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155229982
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147148
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55501/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162552814
  
    **[Test build #47269 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47269/consoleFull)** for PR 9565 at commit [`26b4d85`](https://github.com/apache/spark/commit/26b4d8583980f671513c7d4d532260bce5b86667).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162851649
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206703334
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55183/
    Test FAILed.




[GitHub] spark issue #9565: [SPARK-11593][SQL] Replace catalyst converter with RowEnc...

Posted by koertkuipers <gi...@git.apache.org>.
Github user koertkuipers commented on the issue:

    https://github.com/apache/spark/pull/9565
  
    I think this would be very helpful. The difference in behaviour between Scala UDFs and Scala functions used in Dataset transformations is a constant source of confusion for my users.
    
    For example, the lack of support for Option to declare nullable input types, and the need to use untyped Row objects in UDFs for structs, are inconsistent with how things work when Encoders are used.
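
The inconsistency described here can be shown in plain Scala (no Spark required; `typedLen` and `untypedLen` are hypothetical names for illustration): with an Encoder, a nullable input surfaces as `Option[_]`, while a ScalaUDF receives the raw, possibly-null reference and must null-check it itself.

```scala
// Sketch of the two styles of null handling the comment contrasts.
object NullHandlingSketch {
  // Dataset/Encoder style: nullability is encoded in the type.
  val typedLen: Option[String] => Int = _.map(_.length).getOrElse(-1)

  // ScalaUDF style: nullability is the function body's problem.
  val untypedLen: String => Int = s => if (s == null) -1 else s.length

  def main(args: Array[String]): Unit = {
    assert(typedLen(Some("abc")) == 3)
    assert(typedLen(None) == -1)
    assert(untypedLen(null) == -1)
  }
}
```

Routing UDF inputs through RowEncoder, as this PR proposes, is a step toward giving both paths the same conversion semantics.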




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44374870
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -60,924 +77,238 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
    // Generate code used to convert the arguments to Scala types for user-defined functions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    +  private[this] def genCodeForConverter(
    +      ctx: CodeGenContext,
    +      scalaUDFTermIdx: Int,
    +      index: Int): String = {
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
         val expressionClassName = classOf[Expression].getName
         val scalaUDFClassName = classOf[ScalaUDF].getName
     
         val converterTerm = ctx.freshName("converter")
    -    val expressionIdx = ctx.references.size - 1
         ctx.addMutableState(converterClassName, converterTerm,
           s"this.$converterTerm = ($converterClassName)$typeConvertersClassName" +
             s".createToScalaConverter(((${expressionClassName})((($scalaUDFClassName)" +
    -          s"expressions[$expressionIdx]).getChildren().apply($index))).dataType());")
    +          s"expressions[$scalaUDFTermIdx]).getChildren().apply($index))).dataType());")
    --- End diff ---
    
    Is this a bug?
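
    For context, the question is about the change from recomputing `expressionIdx = ctx.references.size - 1` inside `genCodeForConverter` to passing `scalaUDFTermIdx` in explicitly. A minimal sketch (simplified stand-ins, not the real `CodeGenContext`) of why the recomputed index can go stale:

    ```scala
    // Hypothetical simplification of codegen reference tracking: `references`
    // grows as expressions are registered, so `references.size - 1` only names
    // the ScalaUDF if nothing else was registered in between.
    object IndexSketch {
      val references = scala.collection.mutable.ArrayBuffer[String]()

      def main(args: Array[String]): Unit = {
        references += "scalaUDF"
        val scalaUDFTermIdx = references.size - 1 // captured at registration: 0
        references += "someOtherExpr"             // registered before converters are generated
        val staleIdx = references.size - 1        // recomputed later: 1 -- wrong slot
        assert(references(scalaUDFTermIdx) == "scalaUDF")
        assert(references(staleIdx) != "scalaUDF") // the old code would read this slot
      }
    }
    ```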


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155329242
  
    @viirya Thanks for working on this. I think it's more important to generate the code for the converter in the generated ScalaUDF. 
    
    BTW, the RowEncoder is new in 1.6 (experimental feature), so I'd like to only merge this into master. 
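
    For reference, a rough sketch of the two conversion approaches being compared (Spark 1.6-era identifiers; `inputSchema`, `children`, and `func` stand in for the fields of the real `ScalaUDF` expression, so this is not runnable outside a Spark build):

    ```scala
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.encoders.{ExpressionEncoder, RowEncoder}

    // Before: one Catalyst-to-Scala converter per child expression.
    // lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    // func(converter0(child0.eval(input)), converter1(child1.eval(input)))

    // After (this patch): a single RowEncoder over the whole input schema
    // decodes the InternalRow into an external Row, and the UDF's arguments
    // are read positionally from it.
    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    val f = (input: InternalRow) => {
      val convertedRow: Row = inputEncoder.fromRow(input)
      func(convertedRow.get(0), convertedRow.get(1))
    }
    ```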




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155254834
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162852487
  
    Don't know why it fails on Jenkins but passes locally.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206811488
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155717828
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207215234
  
    Build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162441582
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206703298
  
    **[Test build #55183 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55183/consoleFull)** for PR 9565 at commit [`60f4ca0`](https://github.com/apache/spark/commit/60f4ca00792dcbddb55c25d7e18bd61aeefacd28).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155697632
  
    **[Test build #45616 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45616/consoleFull)** for PR 9565 at commit [`39c0b7a`](https://github.com/apache/spark/commit/39c0b7ad805b04235755b0d339635dae0e7ffbec).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155409419
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155805512
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147682
  
    **[Test build #55502 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55502/consoleFull)** for PR 9565 at commit [`30a867e`](https://github.com/apache/spark/commit/30a867e5a1910d9f5fbf43924556aee2d0694bbb).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155023761
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155442391
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155135946
  
    **[Test build #45382 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45382/consoleFull)** for PR 9565 at commit [`39f6c26`](https://github.com/apache/spark/commit/39f6c26bce822d1a7cb1a5174f5da0c65cf9977b).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155166246
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155287820
  
    **[Test build #45475 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45475/consoleFull)** for PR 9565 at commit [`07ff97a`](https://github.com/apache/spark/commit/07ff97ad563b63428dc4395f50471df7607818ca).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204431258
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206720173
  
    **[Test build #55195 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55195/consoleFull)** for PR 9565 at commit [`7a046fa`](https://github.com/apache/spark/commit/7a046fa3864e8e49860205e2ed47e65de99acb4c).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207298352
  
    @davies @rxin This has been sitting here for a while. I recently revisited it and fixed a previous problem. Can you take a look again? Thanks!




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155703687
  
    **[Test build #45619 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45619/consoleFull)** for PR 9565 at commit [`c910e6e`](https://github.com/apache/spark/commit/c910e6edf9e0d9b7307c981413602706be2f14de).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155739783
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206720343
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55195/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r59133768
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/Transformer.scala ---
    @@ -90,7 +90,7 @@ abstract class UnaryTransformer[IN, OUT, T <: UnaryTransformer[IN, OUT, T]]
        * account of the embedded param map. So the param values should be determined solely by the input
        * param map.
        */
    -  protected def createTransformFunc: IN => OUT
    +  protected val createTransformFunc: (T, IN) => OUT
    --- End diff --
    
    After rethinking this change, I don't think it is a good approach. It works, but it is hacky.
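
    The two signatures under discussion differ in whether the transformer instance is threaded through explicitly. A minimal, Spark-free sketch of both shapes (the `UnaryT` trait and `Upper` class here are hypothetical stand-ins, not Spark's actual `UnaryTransformer`):
    
    ```scala
    // Hypothetical sketch of the two shapes of createTransformFunc.
    trait UnaryT[IN, OUT, T <: UnaryT[IN, OUT, T]] { self: T =>
      // Original shape: the returned function closes over the instance implicitly.
      def createTransformFunc: IN => OUT
    
      // Proposed shape: the instance is passed explicitly, so the function value
      // itself does not capture `this` -- which is what makes it feel hacky to
      // thread the transformer through every call.
      val createTransformFunc2: (T, IN) => OUT = (t, in) => t.createTransformFunc(in)
    }
    
    class Upper extends UnaryT[String, String, Upper] {
      def createTransformFunc: String => String = _.toUpperCase
    }
    
    val u = new Upper
    assert(u.createTransformFunc("abc") == "ABC")
    assert(u.createTransformFunc2(u, "abc") == "ABC")
    println("ok")
    ```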




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206809188
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206784605
  
    **[Test build #55210 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55210/consoleFull)** for PR 9565 at commit [`8dbc551`](https://github.com/apache/spark/commit/8dbc551f72c2a940f9fed54c0cfce5a208c3fc60).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44850992
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala ---
    @@ -306,7 +306,15 @@ trait Row extends Serializable {
        *
        * @throws ClassCastException when data type does not match.
        */
    -  def getStruct(i: Int): Row = getAs[Row](i)
    +  def getStruct(i: Int): Row = {
    +    // Both Product and Row are recognized as StructType values in a Row
    +    val t = get(i)
    +    if (t.isInstanceOf[Product]) {
    +      Row.fromTuple(t.asInstanceOf[Product])
    +    } else {
    +      t.asInstanceOf[Row]
    +    }
    +  }
    --- End diff --
    
    I think we also need to update the javadoc of `Row` to say that `Product` is also a valid value type of `StructType`.
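
    The dispatch in the diff above can be sketched without Spark. `SimpleRow` below is a hypothetical stand-in for Spark's `Row`; note that a case class is itself a `Product`, so the sketch must match `SimpleRow` first, whereas Spark's `Row` is not a `Product` and the patched code's order is fine:
    
    ```scala
    // Minimal, Spark-free stand-in for Row (hypothetical, for illustration only).
    final case class SimpleRow(values: Seq[Any]) {
      def get(i: Int): Any = values(i)
    
      // Mirrors the patched getStruct: a struct slot may hold either a Row-like
      // value or a Product (tuple / case class), which gets converted on read.
      def getStruct(i: Int): SimpleRow = get(i) match {
        case r: SimpleRow => r
        case p: Product   => SimpleRow(p.productIterator.toSeq)
        case other        => throw new ClassCastException(s"not a struct: $other")
      }
    }
    
    val row = SimpleRow(Seq((1, "a"), SimpleRow(Seq(2, "b"))))
    assert(row.getStruct(0) == SimpleRow(Seq(1, "a")))
    assert(row.getStruct(1) == SimpleRow(Seq(2, "b")))
    println("ok")
    ```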




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44808525
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/UDFSuite.scala ---
    @@ -204,32 +204,45 @@ class UDFSuite extends QueryTest with SharedSQLContext {
         sqlContext.udf.register("complexDataFunc",
           (m: Map[String, Int], a: Seq[Int], b: Boolean) => { (m, a, b) } )
     
    -    checkAnswer(
    -      sql("SELECT tmp.t.* FROM (SELECT testDataFunc(key, value) AS t from testData) tmp").toDF(),
    -      testData)
    -    checkAnswer(
    -      sql("""
    -           | SELECT tmp.t.* FROM
    -           | (SELECT decimalDataFunc(a, b) AS t FROM decimalData) tmp
    -          """.stripMargin).toDF(), decimalData)
    -    checkAnswer(
    -      sql("""
    -           | SELECT tmp.t.* FROM
    -           | (SELECT binaryDataFunc(a, b) AS t FROM binaryData) tmp
    -          """.stripMargin).toDF(), binaryData)
    -    checkAnswer(
    -      sql("""
    -           | SELECT tmp.t.* FROM
    -           | (SELECT arrayDataFunc(data, nestedData) AS t FROM arrayData) tmp
    -          """.stripMargin).toDF(), arrayData.toDF())
    -    checkAnswer(
    -      sql("""
    -           | SELECT mapDataFunc(data) AS t FROM mapData
    -          """.stripMargin).toDF(), mapData.toDF())
    -    checkAnswer(
    -      sql("""
    -           | SELECT tmp.t.* FROM
    -           | (SELECT complexDataFunc(m, a, b) AS t FROM complexData) tmp
    -          """.stripMargin).toDF(), complexData.select("m", "a", "b"))
    +    def udfTest(): Unit = {
    +      checkAnswer(
    +        sql("SELECT tmp.t.* FROM (SELECT testDataFunc(key, value) AS t from testData) tmp").toDF(),
    +        testData)
    +      checkAnswer(
    +        sql("""
    +             | SELECT tmp.t.* FROM
    +             | (SELECT decimalDataFunc(a, b) AS t FROM decimalData) tmp
    +            """.stripMargin).toDF(), decimalData)
    +      checkAnswer(
    +        sql("""
    +             | SELECT tmp.t.* FROM
    +             | (SELECT binaryDataFunc(a, b) AS t FROM binaryData) tmp
    +            """.stripMargin).toDF(), binaryData)
    +      checkAnswer(
    +        sql("""
    +             | SELECT tmp.t.* FROM
    +             | (SELECT arrayDataFunc(data, nestedData) AS t FROM arrayData) tmp
    +            """.stripMargin).toDF(), arrayData.toDF())
    +      checkAnswer(
    +        sql("""
    +             | SELECT mapDataFunc(data) AS t FROM mapData
    +            """.stripMargin).toDF(), mapData.toDF())
    +      checkAnswer(
    +        sql("""
    +             | SELECT tmp.t.* FROM
    +             | (SELECT complexDataFunc(m, a, b) AS t FROM complexData) tmp
    +            """.stripMargin).toDF(), complexData.select("m", "a", "b"))
    +    }
    +
    +    withSQLConf(SQLConf.UNSAFE_ENABLED.key -> "true") {
    --- End diff --
    
    These two configs have been removed.
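
    The save/restore shape of the `withSQLConf` test helper used in the diff can be sketched without Spark (the mutable `conf` map below is a hypothetical stand-in for `SQLConf`):
    
    ```scala
    import scala.collection.mutable
    
    // Hypothetical mutable config standing in for SQLConf.
    val conf = mutable.Map("spark.sql.codegen" -> "false")
    
    // Set the given pairs, run the body, then restore the previous values --
    // the same save/restore pattern as SQLTestUtils.withSQLConf.
    def withConf(pairs: (String, String)*)(body: => Unit): Unit = {
      val saved = pairs.map { case (k, _) => k -> conf.get(k) }
      pairs.foreach { case (k, v) => conf(k) = v }
      try body finally saved.foreach {
        case (k, Some(v)) => conf(k) = v
        case (k, None)    => conf.remove(k)
      }
    }
    
    withConf("spark.sql.codegen" -> "true") {
      assert(conf("spark.sql.codegen") == "true")
    }
    // The previous value is restored after the block.
    assert(conf("spark.sql.codegen") == "false")
    println("ok")
    ```
    
    A config key that did not exist before the block is removed again afterwards, rather than being left behind with the temporary value.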




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162842862
  
    **[Test build #47330 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47330/consoleFull)** for PR 9565 at commit [`1ca2efc`](https://github.com/apache/spark/commit/1ca2efcaa2eb511a398b5db67dee5f340a524de2).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162552947
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155390254
  
    **[Test build #45517 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45517/consoleFull)** for PR 9565 at commit [`ecf01bf`](https://github.com/apache/spark/commit/ecf01bfb070644337681631c08408e0610a433e8).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155797574
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155704015
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207262916
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162801753
  
    **[Test build #47315 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47315/consoleFull)** for PR 9565 at commit [`693a6fe`](https://github.com/apache/spark/commit/693a6fed9fdb8002d4e099bd106ec1b44dbdc86d).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162446599
  
    **[Test build #47259 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47259/consoleFull)** for PR 9565 at commit [`f806755`](https://github.com/apache/spark/commit/f80675504a34d5bc0b830b5d25eb2f7f9a637d74).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155857610
  
    **[Test build #45641 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45641/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207215071
  
    **[Test build #55300 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55300/consoleFull)** for PR 9565 at commit [`648c7b2`](https://github.com/apache/spark/commit/648c7b22dbea6de9b4a57fab87941bb8cac268bb).
     * This patch **fails Spark unit tests**.
     * This patch **does not merge cleanly**.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162749182
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206708283
  
    Ok. Finally solved the weird runtime mirror problem.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155133503
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155133543
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155704013
  
    **[Test build #45619 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45619/consoleFull)** for PR 9565 at commit [`c910e6e`](https://github.com/apache/spark/commit/c910e6edf9e0d9b7307c981413602706be2f14de).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
       * `sealed abstract class State[S]`
       * `sealed abstract class StateSpec[KeyType, ValueType, StateType, EmittedType] extends Serializable`
       * `case class StateSpecImpl[K, V, S, T](`
       * `sealed abstract class TrackStateDStream[KeyType, ValueType, StateType, EmittedType: ClassTag](`
       * `class InternalTrackStateDStream[K: ClassTag, V: ClassTag, S: ClassTag, E: ClassTag](`
       * `  case class StateInfo[S](`
       * `  class LimitMarker(val num: Int) extends Serializable`




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155717783
  
    **[Test build #45621 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45621/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155857809
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45641/
    Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155709195
  
     Merged build triggered.



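[Editor's note: the diff reviewed in the next message removes ScalaUDF's hand-written per-arity argument conversion. A minimal, hypothetical sketch of that pattern is below; the type aliases (`Row`, `Child`, `Converter`) and the `bind` helper are simplified stand-ins, not Spark's actual `InternalRow`, `Expression.eval`, or `CatalystTypeConverters` APIs.]

```scala
// Hypothetical, simplified sketch of the per-arity pattern the diff removes:
// each argument count gets its own case that casts the opaque function to the
// matching FunctionN and pairs every child expression with a Catalyst-to-Scala
// converter. The PR replaces this boilerplate with RowEncoder-based conversion.
object PerAritySketch {
  type Row = Seq[Any]          // stand-in for InternalRow
  type Child = Row => Any      // stand-in for Expression.eval
  type Converter = Any => Any  // stand-in for a createToScalaConverter result

  def bind(function: AnyRef,
           children: Seq[Child],
           converters: Seq[Converter]): Row => Any =
    children.size match {
      case 0 =>
        val func = function.asInstanceOf[() => Any]
        (_: Row) => func()
      case 1 =>
        val func = function.asInstanceOf[Any => Any]
        (input: Row) => func(converters(0)(children(0)(input)))
      case 2 =>
        val func = function.asInstanceOf[(Any, Any) => Any]
        (input: Row) =>
          func(converters(0)(children(0)(input)),
               converters(1)(children(1)(input)))
      // ... in the real file, one near-identical case per arity up to 22,
      // which is exactly the duplication the PR eliminates
      case n =>
        throw new UnsupportedOperationException(s"arity $n not sketched")
    }
}
```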

[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44374944
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -60,924 +77,238 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
      // Generate code used to convert the arguments to Scala types for user-defined functions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    +  private[this] def genCodeForConverter(
    +      ctx: CodeGenContext,
    +      scalaUDFTermIdx: Int,
    +      index: Int): String = {
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
    --- End diff --
    
    Should we generate the code for the converter?
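
    The quoted diff enumerates one case per function arity, up to Scala's limit of 22
    parameters. A standalone sketch of that dispatch pattern (hypothetical, simplified
    names; only two arities shown, no Spark dependency):

    ```scala
    object ArityDispatch {
      // Cast the untyped function according to its declared arity, then apply it
      // to the decoded argument values -- mirroring the case 1..22 enumeration above.
      def makeCaller(function: AnyRef, arity: Int): Seq[Any] => Any = arity match {
        case 1 =>
          val func = function.asInstanceOf[Any => Any]
          (args: Seq[Any]) => func(args(0))
        case 2 =>
          val func = function.asInstanceOf[(Any, Any) => Any]
          (args: Seq[Any]) => func(args(0), args(1))
        case n =>
          throw new UnsupportedOperationException(s"arity $n is not supported in this sketch")
      }
    }
    ```

    The casts rely on erasure: any `FunctionN` can be viewed as a function over `Any`,
    which is why a single enumerated block can serve every user-defined function.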


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44257018
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -994,26 +326,24 @@ case class ScalaUDF(
     
    // Generate code used to convert the returned value of user-defined functions to Catalyst type
         val catalystConverterTerm = ctx.freshName("catalystConverter")
    -    val catalystConverterTermIdx = ctx.references.size - 1
    --- End diff --
    
    Remove redundant term index.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204413889
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54702/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155409355
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r46292490
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -1047,6 +391,11 @@ case class ScalaUDF(
         """
       }
     
    -  private[this] val converter = CatalystTypeConverters.createToCatalystConverter(dataType)
    -  override def eval(input: InternalRow): Any = converter(f(input))
    +  lazy val outputEncoder: ExpressionEncoder[Row] =
    +      RowEncoder(StructType(StructField("_c0", dataType) :: Nil))
    +
    +  override def eval(input: InternalRow): Any = {
    +    val projected = InternalRow.fromSeq(children.map(_.eval(input)))
    +    outputEncoder.toRow(Row(f(projected))).copy().asInstanceOf[InternalRow].get(0, dataType)
    --- End diff --
    
    Because `toRow` in `ExpressionEncoder` returns the same underlying mutable `InternalRow` object on every call, we need to copy the result. Otherwise, as I found when trying it, the results will be incorrect.
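
    The pitfall described here -- an encoder that reuses one mutable row for every
    `toRow` result -- can be illustrated with a standalone sketch (hypothetical class
    names, no Spark dependency):

    ```scala
    // A buffer type that, like Spark's mutable InternalRow, can be copied.
    final class ReusedRow(var value: Int) {
      def copy(): ReusedRow = new ReusedRow(value)
    }

    // An "encoder" that writes into one shared buffer on every call,
    // mimicking how ExpressionEncoder.toRow reuses its result row.
    object ReusingEncoder {
      private val buffer = new ReusedRow(0)
      def toRow(v: Int): ReusedRow = { buffer.value = v; buffer }
    }
    ```

    Mapping `Seq(1, 2, 3)` through `toRow` without copying yields three references to
    the same buffer, all reading the last value written; copying each result right
    away preserves the distinct values -- hence the `.copy()` in the patch.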




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155151210
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204644454
  
    **[Test build #54762 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54762/consoleFull)** for PR 9565 at commit [`5da1c13`](https://github.com/apache/spark/commit/5da1c1390e6da81321f2be76e20c92ee03c05ebf).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155697390
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156646254
  
    @davies @cloud-fan the changes to RowEncoder have been separated out as #9712.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208136271
  
    **[Test build #55501 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55501/consoleFull)** for PR 9565 at commit [`884a176`](https://github.com/apache/spark/commit/884a176a0e19ccbaf42f98ad5c1f256dbc10f92b).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207295641
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55325/
    Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162446647
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155410805
  
    **[Test build #45520 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45520/consoleFull)** for PR 9565 at commit [`ecf01bf`](https://github.com/apache/spark/commit/ecf01bfb070644337681631c08408e0610a433e8).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155797524
  
    **[Test build #45628 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45628/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156493860
  
    @viirya This looks good to me overall, but it's kind of risky to merge into 1.6; I'd like to merge it only into master (1.7). The changes in RowEncoder could be useful for 1.6, though. Could you separate that part out (and add some tests for it)?




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156840485
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45954/
    Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155264236
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44509246
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala ---
    @@ -306,7 +306,15 @@ trait Row extends Serializable {
        *
        * @throws ClassCastException when data type does not match.
        */
    -  def getStruct(i: Int): Row = getAs[Row](i)
    +  def getStruct(i: Int): Row = {
    +    // Both Product and Row are recognized as StructType in a Row
    +    val t = get(i)
    +    if (t.isInstanceOf[Product]) {
    +      Row.fromTuple(t.asInstanceOf[Product])
    +    } else {
    +      t.asInstanceOf[Row]
    +    }
    +  }
    --- End diff --
    
    We use `schemaFor` to get a catalyst DataType for the udf's return type. For a `Product` type, we currently return a `StructType`. That causes a problem in `RowEncoder`, because `RowEncoder` expects a `Row`, not a `Product`, for a field of `StructType`. You will get a casting exception if your udf returns something like `(1, 2)`.
    
    The problem is that a field of `StructType` in a `Row` can be either a `Product` or a `Row`. I modified the `getStruct` method in `Row` to return a `Row` for a `Product`.
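
    A standalone sketch of the `getStruct` change (a hypothetical `SimpleRow` class
    standing in for Spark's `Row`): a struct-typed field may hold either a `Product`
    such as a tuple or a nested row, so `getStruct` normalizes both to a row.

    ```scala
    final class SimpleRow(val values: Seq[Any]) {
      def get(i: Int): Any = values(i)

      // A struct field may be a Product (e.g. the tuple (1, 2) returned by a UDF)
      // or an actual nested row; convert Products so callers always get a row.
      def getStruct(i: Int): SimpleRow = get(i) match {
        case p: Product   => new SimpleRow(p.productIterator.toSeq)
        case r: SimpleRow => r
      }
    }
    ```

    With this normalization, a caller reading `getStruct(i).get(0)` no longer needs
    to know whether the UDF produced a tuple or a row for that field.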
    





[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162439812
  
    **[Test build #47259 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47259/consoleFull)** for PR 9565 at commit [`f806755`](https://github.com/apache/spark/commit/f80675504a34d5bc0b830b5d25eb2f7f9a637d74).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162795966
  
    **[Test build #47315 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47315/consoleFull)** for PR 9565 at commit [`693a6fe`](https://github.com/apache/spark/commit/693a6fed9fdb8002d4e099bd106ec1b44dbdc86d).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155702976
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206812471
  
    **[Test build #55218 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55218/consoleFull)** for PR 9565 at commit [`597c971`](https://github.com/apache/spark/commit/597c9713f1262d18b853b5876614a658b3bb61d3).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204413887
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204413882
  
    **[Test build #54702 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54702/consoleFull)** for PR 9565 at commit [`b8f3cce`](https://github.com/apache/spark/commit/b8f3ccea311a777fa587a4353c0e884e8a275e9c).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155136452
  
    **[Test build #45382 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45382/consoleFull)** for PR 9565 at commit [`39f6c26`](https://github.com/apache/spark/commit/39f6c26bce822d1a7cb1a5174f5da0c65cf9977b).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155805485
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204648933
  
    **[Test build #54762 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54762/consoleFull)** for PR 9565 at commit [`5da1c13`](https://github.com/apache/spark/commit/5da1c1390e6da81321f2be76e20c92ee03c05ebf).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155740990
  
    **[Test build #45628 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45628/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r58844750
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/Transformer.scala ---
    @@ -90,7 +90,7 @@ abstract class UnaryTransformer[IN, OUT, T <: UnaryTransformer[IN, OUT, T]]
        * account of the embedded param map. So the param values should be determined solely by the input
        * param map.
        */
    -  protected def createTransformFunc: IN => OUT
    +  protected val createTransformFunc: (T, IN) => OUT
    --- End diff --
    
    The UDF obtained from `def createTransformFunc` can't work with the runtime mirror. I tried many times; only making it a `val` works.
    
    As a `Transformer` can update its params, we need to pass `T` into the UDF so it sees the updated params.
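
    The proposed signature can be sketched standalone (hypothetical trait and class
    names, no Spark dependency): the transform function is a `val` of type
    `(T, IN) => OUT`, so it reads the transformer's current params at call time.

    ```scala
    trait ParamTransformer[T <: ParamTransformer[T, IN, OUT], IN, OUT] { self: T =>
      // The function receives the transformer itself, so param updates made
      // after construction are visible when the function is applied.
      val transformFunc: (T, IN) => OUT
      def transform(in: IN): OUT = transformFunc(self, in)
    }

    final class Scaler extends ParamTransformer[Scaler, Int, Int] {
      var factor: Int = 1 // a mutable "param"
      val transformFunc: (Scaler, Int) => Int = (t, x) => x * t.factor
    }
    ```

    Updating `factor` after construction changes the result of subsequent
    `transform` calls, which is the behavior the `(T, IN) => OUT` shape buys.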




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207262556
  
    **[Test build #55301 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55301/consoleFull)** for PR 9565 at commit [`2a0c319`](https://github.com/apache/spark/commit/2a0c3192e1e7872a800913fe42e4b02ed622b37c).
     * This patch **fails from timeout after a configured wait of \`250m\`**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya closed the pull request at:

    https://github.com/apache/spark/pull/9565




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147724
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55502/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r46293111
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -37,21 +39,36 @@ case class ScalaUDF(
     
       override def toString: String = s"UDF(${children.mkString(",")})"
     
    +  // Accessors used in genCode
    --- End diff --
    
    Ok. Let me try to use that.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206703332
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208151820
  
    Hmm, we can't remove the non-code-generated version of `ScalaUDF`: any `InterpretedProjection` containing a UDF would fail...
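    For context, the two evaluation paths at issue can be sketched like this (a simplified, hypothetical model of Catalyst expressions, not the actual Spark classes):

```scala
// Hypothetical model: every Catalyst expression supports interpreted
// evaluation (`eval`) and, optionally, code generation. An
// InterpretedProjection-style fallback only uses `eval`, so an expression
// whose interpreted path is broken cannot be projected without codegen --
// which is why the interpreted path of ScalaUDF has to stay.
trait ExprSketch {
  def eval(input: Seq[Any]): Any
}

final class InterpretedProjectionSketch(exprs: Seq[ExprSketch]) {
  // Evaluates each expression directly; no generated code involved.
  def apply(input: Seq[Any]): Seq[Any] = exprs.map(_.eval(input))
}
```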




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206706600
  
    **[Test build #55185 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55185/consoleFull)** for PR 9565 at commit [`7a046fa`](https://github.com/apache/spark/commit/7a046fa3864e8e49860205e2ed47e65de99acb4c).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207295894
  
    Finally...tests passed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207188890
  
    **[Test build #55301 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55301/consoleFull)** for PR 9565 at commit [`2a0c319`](https://github.com/apache/spark/commit/2a0c3192e1e7872a800913fe42e4b02ed622b37c).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206720335
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206809192
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55217/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147121
  
    **[Test build #55501 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55501/consoleFull)** for PR 9565 at commit [`884a176`](https://github.com/apache/spark/commit/884a176a0e19ccbaf42f98ad5c1f256dbc10f92b).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r59133802
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/Transformer.scala ---
    @@ -90,7 +90,7 @@ abstract class UnaryTransformer[IN, OUT, T <: UnaryTransformer[IN, OUT, T]]
        * account of the embedded param map. So the param values should be determined solely by the input
        * param map.
        */
    -  protected def createTransformFunc: IN => OUT
    +  protected val createTransformFunc: (T, IN) => OUT
    --- End diff --
    
    This change makes non-code-generated evaluation of `ScalaUDF` work with `RowEncoder`, working around the runtime-mirror problem. I am wondering whether we could support only code-generated evaluation of `ScalaUDF`; then we could avoid this change to `Transformer`. What do you think? @rxin @davies Can you give some suggestions? Thanks!
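    As an aside, the per-arity converter boilerplate this PR removes could in principle be collapsed into an arity-generic form. A hedged sketch (illustration only; the names are hypothetical, and Spark's actual `ScalaUDF` unrolls the arities explicitly rather than allocating a `Seq` per row):

```scala
// Arity-generic sketch of interpreted UDF evaluation (hypothetical; the
// real ScalaUDF unrolls each arity case explicitly for performance).
object GenericUdfEvalSketch {
  type Converter = Any => Any

  def makeEval(function: AnyRef,
               childEvals: Seq[Any => Any],   // stand-ins for child.eval(input)
               converters: Seq[Converter]): Any => Any = { input =>
    // Convert each child's Catalyst value to its Scala representation first.
    val args = childEvals.zip(converters).map { case (ev, conv) => conv(ev(input)) }
    args.length match {
      case 0 => function.asInstanceOf[() => Any]()
      case 1 => function.asInstanceOf[Any => Any](args(0))
      case 2 => function.asInstanceOf[(Any, Any) => Any](args(0), args(1))
      case n => sys.error(s"arity $n elided in this sketch")
    }
  }
}
```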




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206711386
  
    **[Test build #55195 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55195/consoleFull)** for PR 9565 at commit [`7a046fa`](https://github.com/apache/spark/commit/7a046fa3864e8e49860205e2ed47e65de99acb4c).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162756413
  
    **[Test build #47304 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47304/consoleFull)** for PR 9565 at commit [`26b4d85`](https://github.com/apache/spark/commit/26b4d8583980f671513c7d4d532260bce5b86667).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207215236
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55300/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155264787
  
    **[Test build #45475 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45475/consoleFull)** for PR 9565 at commit [`07ff97a`](https://github.com/apache/spark/commit/07ff97ad563b63428dc4395f50471df7607818ca).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204413494
  
    **[Test build #54702 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54702/consoleFull)** for PR 9565 at commit [`b8f3cce`](https://github.com/apache/spark/commit/b8f3ccea311a777fa587a4353c0e884e8a275e9c).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155017754
  
    **[Test build #45358 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45358/consoleFull)** for PR 9565 at commit [`942dad7`](https://github.com/apache/spark/commit/942dad7707aa250de55dfe4d873400cb0418dcdd).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206706621
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155136457
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206809144
  
    **[Test build #55217 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55217/consoleFull)** for PR 9565 at commit [`597c971`](https://github.com/apache/spark/commit/597c9713f1262d18b853b5876614a658b3bb61d3).
     * This patch **fails MiMa tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155264258
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155698009
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45616/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44375262
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -60,924 +77,238 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
      // Generate code used to convert the arguments to Scala types for user-defined functions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    +  private[this] def genCodeForConverter(
    +      ctx: CodeGenContext,
    +      scalaUDFTermIdx: Int,
    +      index: Int): String = {
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
         val expressionClassName = classOf[Expression].getName
         val scalaUDFClassName = classOf[ScalaUDF].getName
     
         val converterTerm = ctx.freshName("converter")
    -    val expressionIdx = ctx.references.size - 1
         ctx.addMutableState(converterClassName, converterTerm,
           s"this.$converterTerm = ($converterClassName)$typeConvertersClassName" +
             s".createToScalaConverter(((${expressionClassName})((($scalaUDFClassName)" +
    -          s"expressions[$expressionIdx]).getChildren().apply($index))).dataType());")
    +          s"expressions[$scalaUDFTermIdx]).getChildren().apply($index))).dataType());")
    --- End diff --
    
    They are redundant and should be the same as `scalaUDFTermIdx`.
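
    A minimal sketch of the pattern this patch moves to (plain Scala, not Spark's actual API): instead of building one `CatalystTypeConverters` converter per child expression, the whole input row is decoded once and its fields are passed positionally to the user function. Here `decodeRow` is a hypothetical stand-in for `RowEncoder.fromRow`:
    
    ```scala
    // Hypothetical stand-in for decoding an internal row into Scala values.
    def decodeRow(input: Array[Any]): IndexedSeq[Any] = input.toIndexedSeq
    
    // A user function stored untyped, as ScalaUDF does.
    val function: AnyRef = (a: Any, b: Any) => a.toString + b.toString
    
    // Arity-2 case: decode once, then apply the function positionally.
    val call: Array[Any] => Any = {
      val func = function.asInstanceOf[(Any, Any) => Any]
      (input: Array[Any]) => {
        val convertedRow = decodeRow(input)
        func(convertedRow(0), convertedRow(1))
      }
    }
    
    assert(call(Array(1, 2)) == "12")
    ```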


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155232635
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207295635
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155288224
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44851452
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala ---
    @@ -306,7 +306,15 @@ trait Row extends Serializable {
        *
        * @throws ClassCastException when data type does not match.
        */
    -  def getStruct(i: Int): Row = getAs[Row](i)
    +  def getStruct(i: Int): Row = {
    +  // Both Product and Row are recognized as StructType in a Row
    +    val t = get(i)
    +    if (t.isInstanceOf[Product]) {
    +      Row.fromTuple(t.asInstanceOf[Product])
    +    } else {
    +      t.asInstanceOf[Row]
    +    }
    +  }
    --- End diff --
    
    ok.
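
    A standalone sketch of the behavior the diff proposes (assumptions: `SimpleRow` is a minimal stand-in for Spark's `Row`, not the real class): a struct-typed field may be backed by either a `Product` (a case class or tuple) or a nested row, so `getStruct` handles both.
    
    ```scala
    case class Person(name: String, age: Int)
    
    // Minimal stand-in for Spark's Row, holding untyped field values.
    class SimpleRow(values: Seq[Any]) {
      def get(i: Int): Any = values(i)
    
      // A struct field may arrive as a Product or as a nested row.
      def getStruct(i: Int): SimpleRow = get(i) match {
        case p: Product   => new SimpleRow(p.productIterator.toSeq)
        case r: SimpleRow => r
      }
    }
    
    val row = new SimpleRow(Seq(Person("Ada", 36)))
    val struct = row.getStruct(0)
    assert(struct.get(0) == "Ada")
    assert(struct.get(1) == 36)
    ```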




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207187370
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155403399
  
    **[Test build #45517 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45517/consoleFull)** for PR 9565 at commit [`ecf01bf`](https://github.com/apache/spark/commit/ecf01bfb070644337681631c08408e0610a433e8).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147721
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by marmbrus <gi...@git.apache.org>.
Github user marmbrus commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r46245609
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -37,21 +39,36 @@ case class ScalaUDF(
     
       override def toString: String = s"UDF(${children.mkString(",")})"
     
    +  // Accessors used in genCode
    --- End diff --
    
    Why do we need manually created accessors?  All of the arguments to a case class should have public methods already created for them.
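
    The reviewer's point can be shown with a tiny sketch (`MiniScalaUDF` is a simplified hypothetical stand-in, not ScalaUDF's real signature): the Scala compiler generates a public accessor for every case class constructor parameter, so hand-written getters like `getChildren()` are unnecessary.
    
    ```scala
    // Case class parameters automatically become public vals.
    case class MiniScalaUDF(function: AnyRef, children: Seq[String])
    
    val udf = MiniScalaUDF((x: Int) => x + 1, Seq("a", "b"))
    
    // `children` is accessible directly; no manual accessor is needed.
    assert(udf.children == Seq("a", "b"))
    ```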




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155015384
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155697401
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155739429
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by rxin <gi...@git.apache.org>.
Github user rxin commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-211758161
  
    What's the problem with runtime mirror?





[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207135289
  
    **[Test build #55266 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55266/consoleFull)** for PR 9565 at commit [`648c7b2`](https://github.com/apache/spark/commit/648c7b22dbea6de9b4a57fab87941bb8cac268bb).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207233864
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206696333
  
    **[Test build #55183 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55183/consoleFull)** for PR 9565 at commit [`60f4ca0`](https://github.com/apache/spark/commit/60f4ca00792dcbddb55c25d7e18bd61aeefacd28).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44513214
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -60,980 +79,305 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
    -  // Generate codes used to convert the arguments to Scala type for user-defined funtions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    -    val converterClassName = classOf[Any => Any].getName
    -    val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
    -    val expressionClassName = classOf[Expression].getName
    -    val scalaUDFClassName = classOf[ScalaUDF].getName
    -
    -    val converterTerm = ctx.freshName("converter")
    -    val expressionIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, converterTerm,
    -      s"this.$converterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToScalaConverter(((${expressionClassName})((($scalaUDFClassName)" +
    -          s"expressions[$expressionIdx]).getChildren().apply($index))).dataType());")
    -    converterTerm
    -  }
    -
       override def genCode(
           ctx: CodeGenContext,
           ev: GeneratedExpressionCode): String = {
     
         ctx.references += this
    +    val scalaUDFTermIdx = ctx.references.size - 1
     
         val scalaUDFClassName = classOf[ScalaUDF].getName
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
         val expressionClassName = classOf[Expression].getName
    -
    -    // Generate codes used to convert the returned value of user-defined functions to Catalyst type
    -    val catalystConverterTerm = ctx.freshName("catalystConverter")
    -    val catalystConverterTermIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, catalystConverterTerm,
    -      s"this.$catalystConverterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToCatalystConverter((($scalaUDFClassName)expressions" +
    -          s"[$catalystConverterTermIdx]).dataType());")
    +    val expressionEncoderClassName = classOf[ExpressionEncoder[Row]].getName
    +    val rowEncoderClassName = RowEncoder.getClass.getName + ".MODULE$"
    +    val structTypeClassName = StructType.getClass.getName + ".MODULE$"
    +    val rowClassName = Row.getClass.getName + ".MODULE$"
    +    val rowClass = classOf[Row].getName
    +    val internalRowClassName = classOf[InternalRow].getName
    +    // scalastyle:off
    +    val javaConversionClassName = scala.collection.JavaConversions.getClass.getName + ".MODULE$"
    --- End diff --
    
`JavaConversions` has been banned for implicit conversions between Java and Scala types. However, we do not use it implicitly on the Scala side; we only call it explicitly from the generated Java code. `JavaConverters` doesn't offer a simple, direct method to call for this purpose, so we disable scalastyle here temporarily.
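
A minimal sketch (plain Scala, not Spark's generated code) of the bridging in question: idiomatic Scala uses the implicit-backed `JavaConverters` extension methods, whereas generated Java source cannot rely on Scala implicits and must invoke a conversion method explicitly through its JVM name (e.g. `JavaConversions$.MODULE$.seqAsJavaList(...)` in Java source):

```scala
// Idiomatic Scala side of the bridge: the asJava extension method,
// provided by the implicit conversions in JavaConverters.
import scala.collection.JavaConverters._

object BridgeSketch {
  // Convert a Scala Seq to a java.util.List the way hand-written
  // Scala code would; generated Java has to spell out the call instead.
  def toJavaList(xs: Seq[Int]): java.util.List[Int] = xs.asJava
}
```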


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org
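
The long per-arity match in the diff above follows one repeated pattern: cast the untyped function reference to the matching `FunctionN` by argument count, then apply it to the decoded field values. A standalone sketch of that pattern (hypothetical names, not Spark code), showing only the first two arities:

```scala
// Minimal sketch of arity dispatch: cast an untyped function reference
// to the right FunctionN based on the declared argument count, then
// apply it to a sequence of already-decoded values.
object AritySketch {
  def bind(function: AnyRef, arity: Int): Seq[Any] => Any = arity match {
    case 1 =>
      val f = function.asInstanceOf[Any => Any]
      (args: Seq[Any]) => f(args(0))
    case 2 =>
      val f = function.asInstanceOf[(Any, Any) => Any]
      (args: Seq[Any]) => f(args(0), args(1))
    case n =>
      // Spark enumerates cases up to 22 (the FunctionN limit in Scala).
      throw new IllegalArgumentException(s"unsupported arity: $n")
  }
}
```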


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-209278318
  
    Replacing the catalyst converter with RowEncoder in the non-code-generated ScalaUDF does not seem doable due to a runtime mirror limitation.
    
    ping @rxin Is it OK if I revert the changes to the non-code-generated ScalaUDF here and just merge the code-generated version?




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by davies <gi...@git.apache.org>.
Github user davies commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44374607
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -1041,6 +371,10 @@ case class ScalaUDF(
         """
       }
     
    -  private[this] val converter = CatalystTypeConverters.createToCatalystConverter(dataType)
    -  override def eval(input: InternalRow): Any = converter(f(input))
    +  override def eval(input: InternalRow): Any = {
    +    val projected = InternalRow.fromSeq(children.map(_.eval(input)))
    +    val outputEncoder: ExpressionEncoder[Row] =
    --- End diff --
    
    `outputEncoder` should be created outside of `eval`, or it will be too slow.
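
The suggestion can be sketched in plain Scala (with a hypothetical `makeConverter` standing in for encoder construction): build the converter once when the expression is constructed, not on every `eval` call.

```scala
// Sketch of the review suggestion: hoist expensive, deterministic setup
// (such as constructing an ExpressionEncoder) out of the per-row path.
// `makeConverter` is a hypothetical stand-in for encoder construction.
object HoistSketch {
  private def makeConverter(): Int => String = {
    // imagine costly reflection / code generation happening here
    (i: Int) => s"row-$i"
  }

  // Slow shape: rebuilds the converter on every evaluation.
  def evalSlow(input: Int): String = makeConverter()(input)

  // Fast shape: the converter is built once, at construction time,
  // and every evaluation reuses it.
  private val converter: Int => String = makeConverter()
  def evalFast(input: Int): String = converter(input)
}
```

Both shapes return the same result; only the per-row cost differs.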




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162446652
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47259/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155015316
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by cloud-fan <gi...@git.apache.org>.
Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r44850557
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/Row.scala ---
    @@ -306,7 +306,15 @@ trait Row extends Serializable {
        *
        * @throws ClassCastException when data type does not match.
        */
    -  def getStruct(i: Int): Row = getAs[Row](i)
    +  def getStruct(i: Int): Row = {
    +    // Both Product and Row are recognized as StructType in a Row
    +    val t = get(i)
    +    if (t.isInstanceOf[Product]) {
    +      Row.fromTuple(t.asInstanceOf[Product])
    +    } else {
    +      t.asInstanceOf[Row]
    +    }
    +  }
    --- End diff --
    
    This seems like a bug to me if the field type is `StructType` but the value type is `Product`. We use `schemaFor` to get the catalyst type, which is the field type for `InternalRow`. However, for the external `Row`, we will/should use `dataTypeFor`, which returns `ObjectType` for `Product`. And for `ObjectType`, we should use `Row.getAs[T]` to get the field.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206822919
  
    **[Test build #55219 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55219/consoleFull)** for PR 9565 at commit [`597c971`](https://github.com/apache/spark/commit/597c9713f1262d18b853b5876614a658b3bb61d3).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204648944
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155704019
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45619/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155698005
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204648946
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54762/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207188224
  
    **[Test build #55300 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55300/consoleFull)** for PR 9565 at commit [`648c7b2`](https://github.com/apache/spark/commit/648c7b22dbea6de9b4a57fab87941bb8cac268bb).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155739762
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162543995
  
    **[Test build #47269 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47269/consoleFull)** for PR 9565 at commit [`26b4d85`](https://github.com/apache/spark/commit/26b4d8583980f671513c7d4d532260bce5b86667).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206701004
  
    **[Test build #55185 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55185/consoleFull)** for PR 9565 at commit [`7a046fa`](https://github.com/apache/spark/commit/7a046fa3864e8e49860205e2ed47e65de99acb4c).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162750004
  
    **[Test build #47304 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47304/consoleFull)** for PR 9565 at commit [`26b4d85`](https://github.com/apache/spark/commit/26b4d8583980f671513c7d4d532260bce5b86667).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155717830
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45621/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155409445
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9565#discussion_r46785072
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/ScalaUDF.scala ---
    @@ -66,980 +85,305 @@ case class ScalaUDF(
     
       */
     
    -  // Accessors used in genCode
    -  def userDefinedFunc(): AnyRef = function
    -  def getChildren(): Seq[Expression] = children
    -
    -  private[this] val f = children.size match {
    -    case 0 =>
    -      val func = function.asInstanceOf[() => Any]
    -      (input: InternalRow) => {
    -        func()
    -      }
    -
    -    case 1 =>
    -      val func = function.asInstanceOf[(Any) => Any]
    -      val child0 = children(0)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)))
    -      }
    -
    -    case 2 =>
    -      val func = function.asInstanceOf[(Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)))
    -      }
    -
    -    case 3 =>
    -      val func = function.asInstanceOf[(Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)))
    -      }
    -
    -    case 4 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)))
    -      }
    -
    -    case 5 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)))
    -      }
    -
    -    case 6 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)))
    -      }
    -
    -    case 7 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)))
    -      }
    -
    -    case 8 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)))
    -      }
    -
    -    case 9 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)))
    -      }
    -
    -    case 10 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)))
    -      }
    -
    -    case 11 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)))
    -      }
    -
    -    case 12 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)))
    -      }
    -
    -    case 13 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)))
    -      }
    -
    -    case 14 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)))
    -      }
    -
    -    case 15 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)))
    -      }
    -
    -    case 16 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)))
    -      }
    -
    -    case 17 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)))
    -      }
    -
    -    case 18 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)))
    -      }
    -
    -    case 19 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)))
    -      }
    -
    -    case 20 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)))
    -      }
    -
    -    case 21 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)))
    -      }
    -
    -    case 22 =>
    -      val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    -      val child0 = children(0)
    -      val child1 = children(1)
    -      val child2 = children(2)
    -      val child3 = children(3)
    -      val child4 = children(4)
    -      val child5 = children(5)
    -      val child6 = children(6)
    -      val child7 = children(7)
    -      val child8 = children(8)
    -      val child9 = children(9)
    -      val child10 = children(10)
    -      val child11 = children(11)
    -      val child12 = children(12)
    -      val child13 = children(13)
    -      val child14 = children(14)
    -      val child15 = children(15)
    -      val child16 = children(16)
    -      val child17 = children(17)
    -      val child18 = children(18)
    -      val child19 = children(19)
    -      val child20 = children(20)
    -      val child21 = children(21)
    -      lazy val converter0 = CatalystTypeConverters.createToScalaConverter(child0.dataType)
    -      lazy val converter1 = CatalystTypeConverters.createToScalaConverter(child1.dataType)
    -      lazy val converter2 = CatalystTypeConverters.createToScalaConverter(child2.dataType)
    -      lazy val converter3 = CatalystTypeConverters.createToScalaConverter(child3.dataType)
    -      lazy val converter4 = CatalystTypeConverters.createToScalaConverter(child4.dataType)
    -      lazy val converter5 = CatalystTypeConverters.createToScalaConverter(child5.dataType)
    -      lazy val converter6 = CatalystTypeConverters.createToScalaConverter(child6.dataType)
    -      lazy val converter7 = CatalystTypeConverters.createToScalaConverter(child7.dataType)
    -      lazy val converter8 = CatalystTypeConverters.createToScalaConverter(child8.dataType)
    -      lazy val converter9 = CatalystTypeConverters.createToScalaConverter(child9.dataType)
    -      lazy val converter10 = CatalystTypeConverters.createToScalaConverter(child10.dataType)
    -      lazy val converter11 = CatalystTypeConverters.createToScalaConverter(child11.dataType)
    -      lazy val converter12 = CatalystTypeConverters.createToScalaConverter(child12.dataType)
    -      lazy val converter13 = CatalystTypeConverters.createToScalaConverter(child13.dataType)
    -      lazy val converter14 = CatalystTypeConverters.createToScalaConverter(child14.dataType)
    -      lazy val converter15 = CatalystTypeConverters.createToScalaConverter(child15.dataType)
    -      lazy val converter16 = CatalystTypeConverters.createToScalaConverter(child16.dataType)
    -      lazy val converter17 = CatalystTypeConverters.createToScalaConverter(child17.dataType)
    -      lazy val converter18 = CatalystTypeConverters.createToScalaConverter(child18.dataType)
    -      lazy val converter19 = CatalystTypeConverters.createToScalaConverter(child19.dataType)
    -      lazy val converter20 = CatalystTypeConverters.createToScalaConverter(child20.dataType)
    -      lazy val converter21 = CatalystTypeConverters.createToScalaConverter(child21.dataType)
    -      (input: InternalRow) => {
    -        func(
    -          converter0(child0.eval(input)),
    -          converter1(child1.eval(input)),
    -          converter2(child2.eval(input)),
    -          converter3(child3.eval(input)),
    -          converter4(child4.eval(input)),
    -          converter5(child5.eval(input)),
    -          converter6(child6.eval(input)),
    -          converter7(child7.eval(input)),
    -          converter8(child8.eval(input)),
    -          converter9(child9.eval(input)),
    -          converter10(child10.eval(input)),
    -          converter11(child11.eval(input)),
    -          converter12(child12.eval(input)),
    -          converter13(child13.eval(input)),
    -          converter14(child14.eval(input)),
    -          converter15(child15.eval(input)),
    -          converter16(child16.eval(input)),
    -          converter17(child17.eval(input)),
    -          converter18(child18.eval(input)),
    -          converter19(child19.eval(input)),
    -          converter20(child20.eval(input)),
    -          converter21(child21.eval(input)))
    -      }
    +  private[this] val f = {
    +    lazy val inputEncoder: ExpressionEncoder[Row] = RowEncoder(inputSchema)
    +    children.size match {
    +      case 0 =>
    +        val func = function.asInstanceOf[() => Any]
    +        (input: InternalRow) => {
    +          func()
    +        }
    +
    +      case 1 =>
    +        val func = function.asInstanceOf[(Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0))
    +        }
    +
    +      case 2 =>
    +        val func = function.asInstanceOf[(Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1))
    +        }
    +
    +      case 3 =>
    +        val func = function.asInstanceOf[(Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2))
    +        }
    +
    +      case 4 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3))
    +        }
    +
    +      case 5 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4))
    +        }
    +
    +      case 6 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5))
    +        }
    +
    +      case 7 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6))
    +        }
    +
    +      case 8 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7))
    +        }
    +
    +      case 9 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8))
    +        }
    +
    +      case 10 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9))
    +        }
    +
    +      case 11 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10))
    +        }
    +
    +      case 12 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11))
    +        }
    +
    +      case 13 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12))
    +        }
    +
    +      case 14 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13))
    +        }
    +
    +      case 15 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14))
    +        }
    +
    +      case 16 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15))
    +        }
    +
    +      case 17 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16))
    +        }
    +
    +      case 18 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17))
    +        }
    +
    +      case 19 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18))
    +        }
    +
    +      case 20 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19))
    +        }
    +
    +      case 21 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20))
    +        }
    +
    +      case 22 =>
    +        val func = function.asInstanceOf[(Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any, Any) => Any]
    +        (input: InternalRow) => {
    +          val convertedRow: Row = inputEncoder.fromRow(input)
    +          func(convertedRow.get(0), convertedRow.get(1), convertedRow.get(2), convertedRow.get(3),
    +            convertedRow.get(4), convertedRow.get(5), convertedRow.get(6), convertedRow.get(7),
    +            convertedRow.get(8), convertedRow.get(9), convertedRow.get(10), convertedRow.get(11),
    +            convertedRow.get(12), convertedRow.get(13), convertedRow.get(14), convertedRow.get(15),
    +            convertedRow.get(16), convertedRow.get(17), convertedRow.get(18), convertedRow.get(19),
    +            convertedRow.get(20), convertedRow.get(21))
    +        }
    +    }
       }
     
       // scalastyle:on
     
    -  // Generate codes used to convert the arguments to Scala type for user-defined funtions
    -  private[this] def genCodeForConverter(ctx: CodeGenContext, index: Int): String = {
    -    val converterClassName = classOf[Any => Any].getName
    -    val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
    -    val expressionClassName = classOf[Expression].getName
    -    val scalaUDFClassName = classOf[ScalaUDF].getName
    -
    -    val converterTerm = ctx.freshName("converter")
    -    val expressionIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, converterTerm,
    -      s"this.$converterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToScalaConverter(((${expressionClassName})((($scalaUDFClassName)" +
    -          s"expressions[$expressionIdx]).getChildren().apply($index))).dataType());")
    -    converterTerm
    -  }
    -
       override def genCode(
           ctx: CodeGenContext,
           ev: GeneratedExpressionCode): String = {
     
         ctx.references += this
    +    val scalaUDFTermIdx = ctx.references.size - 1
     
         val scalaUDFClassName = classOf[ScalaUDF].getName
         val converterClassName = classOf[Any => Any].getName
         val typeConvertersClassName = CatalystTypeConverters.getClass.getName + ".MODULE$"
         val expressionClassName = classOf[Expression].getName
    -
    -    // Generate codes used to convert the returned value of user-defined functions to Catalyst type
    -    val catalystConverterTerm = ctx.freshName("catalystConverter")
    -    val catalystConverterTermIdx = ctx.references.size - 1
    -    ctx.addMutableState(converterClassName, catalystConverterTerm,
    -      s"this.$catalystConverterTerm = ($converterClassName)$typeConvertersClassName" +
    -        s".createToCatalystConverter((($scalaUDFClassName)expressions" +
    -          s"[$catalystConverterTermIdx]).dataType());")
    +    val expressionEncoderClassName = classOf[ExpressionEncoder[Row]].getName
    +    val rowEncoderClassName = RowEncoder.getClass.getName + ".MODULE$"
    +    val structTypeClassName = StructType.getClass.getName + ".MODULE$"
    +    val rowClassName = Row.getClass.getName + ".MODULE$"
    +    val rowClass = classOf[Row].getName
    +    val internalRowClassName = classOf[InternalRow].getName
    +    // scalastyle:off
    +    val javaConversionClassName = scala.collection.JavaConversions.getClass.getName + ".MODULE$"
    +    // scalastyle:on
    +
    +    // Generate code for input encoder
    +    val inputExpressionEncoderTerm = ctx.freshName("inputExpressionEncoder")
    +    ctx.addMutableState(expressionEncoderClassName, inputExpressionEncoderTerm,
    +      s"this.$inputExpressionEncoderTerm = ($expressionEncoderClassName)$rowEncoderClassName" +
    +        s".apply((($scalaUDFClassName)expressions" +
    +          s"[$scalaUDFTermIdx]).getInputSchema());")
    +
    +    // Generate code for output encoder
    +    val outputExpressionEncoderTerm = ctx.freshName("outputExpressionEncoder")
    +    ctx.addMutableState(expressionEncoderClassName, outputExpressionEncoderTerm,
    +      s"this.$outputExpressionEncoderTerm = ($expressionEncoderClassName)$rowEncoderClassName" +
    +        s".apply((($scalaUDFClassName)expressions[$scalaUDFTermIdx]).getDataType());")
     
         val resultTerm = ctx.freshName("result")
     
    -    // This must be called before children expressions' codegen
    -    // because ctx.references is used in genCodeForConverter
    -    val converterTerms = (0 until children.size).map(genCodeForConverter(ctx, _))
    -
         // Initialize user-defined function
         val funcClassName = s"scala.Function${children.size}"
     
         val funcTerm = ctx.freshName("udf")
    -    val funcExpressionIdx = ctx.references.size - 1
         ctx.addMutableState(funcClassName, funcTerm,
           s"this.$funcTerm = ($funcClassName)((($scalaUDFClassName)expressions" +
    -        s"[$funcExpressionIdx]).userDefinedFunc());")
    +        s"[$scalaUDFTermIdx]).userDefinedFunc());")
     
         // codegen for children expressions
         val evals = children.map(_.gen(ctx))
    +    val evalsArgs = evals.map(_.value).mkString(", ")
    +    val evalsAsSeq = s"$javaConversionClassName.asScalaIterable" +
    +      s"(java.util.Arrays.asList($evalsArgs)).toList()"
    +    val inputInternalRowTerm = ctx.freshName("inputRow")
    +    val inputInternalRow = s"$rowClass $inputInternalRowTerm = " +
    +      s"($rowClass)$inputExpressionEncoderTerm.fromRow(InternalRow.fromSeq($evalsAsSeq));"
     
         // Generate the codes for expressions and calling user-defined function
         // We need to get the boxedType of dataType's javaType here. Because for the dataType
         // such as IntegerType, its javaType is `int` and the returned type of user-defined
         // function is Object. Trying to convert an Object to `int` will cause casting exception.
         val evalCode = evals.map(_.code).mkString
    -    val funcArguments = converterTerms.zip(evals).map {
    -      case (converter, eval) => s"$converter.apply(${eval.value})"
    -    }.mkString(",")
    +
    +    val funcArguments = (0 until children.size).map { i =>
    +      s"$inputInternalRowTerm.get($i)"
    +    }.mkString(", ")
    +
    +    val rowParametersTerm = ctx.freshName("rowParameters")
    +    val innerRow = s"$rowClass $rowParametersTerm = $rowClassName.apply(" +
    +      s"$javaConversionClassName.asScalaIterable" +
    +      s"(java.util.Arrays.asList($funcTerm.apply($funcArguments))).toList());"
    +    val internalRowTerm = ctx.freshName("internalRow")
    +    val internalRow = s"$internalRowClassName $internalRowTerm = ($internalRowClassName)" +
    +      s"${outputExpressionEncoderTerm}.toRow($rowParametersTerm).copy();"
    --- End diff --
    
    Hmm, basically what the Encoder does in `toRow` is project the input row to an internal row using `GenerateUnsafeProjection`. I don't think we can just embed the generated code here. What do you think?
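    For reference, a minimal sketch of the round trip the generated code above performs, written against the 1.6-era `RowEncoder` API this PR uses (`toRow`/`fromRow`); the schema and values here are made-up for illustration:

    ```scala
    import org.apache.spark.sql.Row
    import org.apache.spark.sql.catalyst.InternalRow
    import org.apache.spark.sql.catalyst.encoders.RowEncoder
    import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

    object RowEncoderSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical schema standing in for the UDF's input types.
        val schema = StructType(Seq(
          StructField("a", IntegerType),
          StructField("b", StringType)))
        val encoder = RowEncoder(schema)

        // External Row -> Catalyst InternalRow: what the generated code's
        // `toRow` call does, internally via a GenerateUnsafeProjection
        // (hence the question above about embedding that generated code).
        val internal: InternalRow = encoder.toRow(Row(1, "x")).copy()

        // Catalyst InternalRow -> external Row: what the generated
        // `fromRow` call does before handing arguments to the user function.
        val external: Row = encoder.fromRow(internal)
      }
    }
    ```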


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155703006
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155797577
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45628/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207234020
  
    **[Test build #55325 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55325/consoleFull)** for PR 9565 at commit [`2a0c319`](https://github.com/apache/spark/commit/2a0c3192e1e7872a800913fe42e4b02ed622b37c).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206813056
  
    **[Test build #55218 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55218/consoleFull)** for PR 9565 at commit [`597c971`](https://github.com/apache/spark/commit/597c9713f1262d18b853b5876614a658b3bb61d3).
     * This patch **fails RAT tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208138899
  
    **[Test build #55502 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55502/consoleFull)** for PR 9565 at commit [`30a867e`](https://github.com/apache/spark/commit/30a867e5a1910d9f5fbf43924556aee2d0694bbb).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206980269
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55226/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206788482
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55210/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207195538
  
    **[Test build #55266 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55266/consoleFull)** for PR 9565 at commit [`648c7b2`](https://github.com/apache/spark/commit/648c7b22dbea6de9b4a57fab87941bb8cac268bb).
     * This patch **fails from timeout after a configured wait of \`250m\`**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206706622
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55185/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206943194
  
    **[Test build #55226 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55226/consoleFull)** for PR 9565 at commit [`405e8b0`](https://github.com/apache/spark/commit/405e8b048f475bef2893f3be76b43516ab829954).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155245604
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156840447
  
    **[Test build #45954 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45954/consoleFull)** for PR 9565 at commit [`5c18c0c`](https://github.com/apache/spark/commit/5c18c0c6a3b1d65ee7ed81f54f816796db63d394).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204416684
  
    **[Test build #54703 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54703/consoleFull)** for PR 9565 at commit [`2fcbe69`](https://github.com/apache/spark/commit/2fcbe69b7b626201736a16867f3c1feefc834ccf).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155809846
  
    **[Test build #45641 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45641/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206822344
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162569257
  
    Weird. These tests pass on my server.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155709584
  
    **[Test build #45621 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45621/consoleFull)** for PR 9565 at commit [`1234515`](https://github.com/apache/spark/commit/12345150cff5c02780e0055600b584a0aeaaf441).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156396133
  
    ping @davies 




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155330331
  
    @davies Thanks for reviewing. I will work on the generated version later.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206696183
  
    Looks like the runtime mirror doesn't work well when a member method is used as a UDF.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-156822427
  
    **[Test build #45954 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45954/consoleFull)** for PR 9565 at commit [`5c18c0c`](https://github.com/apache/spark/commit/5c18c0c6a3b1d65ee7ed81f54f816796db63d394).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-204431208
  
    **[Test build #54703 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54703/consoleFull)** for PR 9565 at commit [`2fcbe69`](https://github.com/apache/spark/commit/2fcbe69b7b626201736a16867f3c1feefc834ccf).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206980268
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-207262918
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/55301/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-211755597
  
    Closing this now. Maybe revisit it in the future.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162756518
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/47304/
    Test FAILed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155245715
  
     Merged build triggered.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155254796
  
    **[Test build #45449 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45449/consoleFull)** for PR 9565 at commit [`1e13ff9`](https://github.com/apache/spark/commit/1e13ff98d1458b78c5cb8cb187bad4d91dc1143e).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-211760253
  
    When using a member method as a UDF, for example `def createTransformFunc` in `org.apache.spark.ml.Transformer`, the Jenkins tests always hit an exception.
    
    Otherwise, it works well.
    
    BTW, I can't reproduce that exception locally. Maybe the Java version matters.
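    To make "member method as UDF" concrete, here is a minimal Spark-free sketch (the `Doubler` class is made-up): eta-expanding a member method produces a function value that closes over the enclosing instance, which is the extra state the runtime mirror has to reflect over in this case.

    ```scala
    // A stand-in for a class like Transformer whose method becomes the UDF body.
    class Doubler {
      def transform(x: Int): Int = x * 2
    }

    object MemberMethodUdf {
      def main(args: Array[String]): Unit = {
        val doubler = new Doubler
        // Eta-expansion: `doubler.transform _` yields a Function1 that
        // captures `doubler`, unlike a plain function literal.
        val asFunction: Int => Int = doubler.transform _
        println(asFunction(21)) // prints 42
      }
    }
    ```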




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-206805301
  
    **[Test build #55217 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/55217/consoleFull)** for PR 9565 at commit [`597c971`](https://github.com/apache/spark/commit/597c9713f1262d18b853b5876614a658b3bb61d3).




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-162851617
  
    **[Test build #47330 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/47330/consoleFull)** for PR 9565 at commit [`1ca2efc`](https://github.com/apache/spark/commit/1ca2efcaa2eb511a398b5db67dee5f340a524de2).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155697990
  
    **[Test build #45616 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45616/consoleFull)** for PR 9565 at commit [`39c0b7a`](https://github.com/apache/spark/commit/39c0b7ad805b04235755b0d339635dae0e7ffbec).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
       * `sealed abstract class State[S]`
       * `sealed abstract class StateSpec[KeyType, ValueType, StateType, EmittedType] extends Serializable`
       * `case class StateSpecImpl[K, V, S, T](`
       * `sealed abstract class TrackStateDStream[KeyType, ValueType, StateType, EmittedType: ClassTag](`
       * `class InternalTrackStateDStream[K: ClassTag, V: ClassTag, S: ClassTag, E: ClassTag](`
       * `  case class StateInfo[S](`
       * `  class LimitMarker(val num: Int) extends Serializable`




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by viirya <gi...@git.apache.org>.
Github user viirya commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155804653
  
    retest this please.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155388623
  
    Merged build started.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155442194
  
    **[Test build #45520 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/45520/consoleFull)** for PR 9565 at commit [`ecf01bf`](https://github.com/apache/spark/commit/ecf01bfb070644337681631c08408e0610a433e8).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-155442397
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/45520/
    Test PASSed.




[GitHub] spark pull request: [SPARK-11593][SQL] Replace catalyst converter ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/9565#issuecomment-208147147
  
    Merged build finished. Test FAILed.

