Posted to reviews@spark.apache.org by lianhuiwang <gi...@git.apache.org> on 2016/06/16 15:17:07 UTC

[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: CREAT...

GitHub user lianhuiwang opened a pull request:

    https://github.com/apache/spark/pull/13706

    [SPARK-15988] [SQL] Implement DDL commands: CREATE/DROP TEMPORARY MACRO

    ## What changes were proposed in this pull request?
    In https://issues.apache.org/jira/browse/HIVE-2655, Hive implemented CREATE/DROP TEMPORARY MACRO. This PR adds native DDL command support for creating and dropping temporary macros in Spark SQL.
    A temporary macro can be considered a special temporary function whose FunctionBuilder transforms the macro into an expression that can be evaluated directly.
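The substitution idea above can be sketched with a toy expression tree (illustrative Python with hypothetical names like `Macro` and `expand`; these are not Spark's actual classes). The macro stores parameter names and a body, and the builder replaces each parameter reference with the actual call argument, yielding an expression that can be evaluated directly:

```python
# Toy model only: not Spark catalyst code.
from dataclasses import dataclass

@dataclass
class Lit:
    value: int

@dataclass
class Ref:
    name: str

@dataclass
class Mul:
    left: object
    right: object

@dataclass
class Macro:
    params: list   # parameter names, e.g. ["x"]
    body: object   # body expression, e.g. Mul(Ref("x"), Ref("x"))

def expand(macro, args):
    """Substitute call arguments for parameter references in the body."""
    if len(args) != len(macro.params):
        raise ValueError("arity mismatch")
    binding = dict(zip(macro.params, args))
    def subst(e):
        if isinstance(e, Ref) and e.name in binding:
            return binding[e.name]
        if isinstance(e, Mul):
            return Mul(subst(e.left), subst(e.right))
        return e
    return subst(macro.body)

def evaluate(e):
    if isinstance(e, Lit):
        return e.value
    if isinstance(e, Mul):
        return evaluate(e.left) * evaluate(e.right)
    raise ValueError(f"unbound reference {e.name}")

# CREATE TEMPORARY MACRO square(x int) x * x
square = Macro(["x"], Mul(Ref("x"), Ref("x")))
```

Expanding `square` with the argument `Lit(3)` produces `Mul(Lit(3), Lit(3))`, which evaluates to 9 with no remaining parameter references.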
    
    ## How was this patch tested?
    Added unit tests.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/lianhuiwang/spark macro

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/13706.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #13706
    
----
commit f2433c2b6769bb7fcaf40b6e74a1eac3589aae82
Author: Lianhui Wang <li...@gmail.com>
Date:   2016-06-16T15:08:14Z

    init commit

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67545974
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    --- End diff --
    
    We should check that all parameter names are unique.
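A minimal sketch of the requested check (hypothetical helper name, shown in Python rather than the PR's Scala) rejects a macro whose parameter list repeats a name:

```python
# Illustrative only: duplicate-parameter-name validation for a macro.
from collections import Counter

def check_unique_params(col_names):
    dupes = [n for n, c in Counter(col_names).items() if c > 1]
    if dupes:
        raise ValueError(
            f"Cannot create macro: duplicate parameter names {dupes}")
    return col_names
```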



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77533/
    Test PASSed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77467 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77467/testReport)** for PR 13706 at commit [`b52698f`](https://github.com/apache/spark/commit/b52698ffdb5b5c809f749b7793f276bb9b305a0c).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68487/
    Test PASSed.



[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118989143
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
    @@ -1090,6 +1090,24 @@ class SessionCatalog(
         }
       }
     
    +  /** Create a temporary macro. */
    +  def createTempMacro(
    +      name: String,
    +      info: ExpressionInfo,
    +      functionBuilder: FunctionBuilder): Unit = {
    +    if (functionRegistry.functionExists(name)) {
    --- End diff --
    
    ```
    hive> create temporary macro max(x int)
        > x*x;
    OK
    Time taken: 0.014 seconds
    
    hive> select max(3) from t1;
    OK
    9
    Time taken: 0.468 seconds, Fetched: 1 row(s)
    
    hive> select max(3,4) from t1;
    FAILED: SemanticException [Error 10015]: Line 1:7 Arguments length mismatch '4': The macro max accepts exactly 1 arguments.
    ```
    
    Hive overwrites the existing function with the temporary macro
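The behaviour in the Hive session above can be modelled with a toy registry (illustrative Python, not Hive's or Spark's implementation): creating a temporary macro silently replaces whatever is registered under that name, and calls are then checked against the macro's exact arity.

```python
# Toy function registry sketching Hive's overwrite-plus-arity semantics.
class Registry:
    def __init__(self):
        # Built-in entry with no fixed arity, standing in for max().
        self.functions = {"max": (None, max)}

    def create_temp_macro(self, name, arity, fn):
        # Hive-style: silently overwrite any existing function of this name.
        self.functions[name] = (arity, fn)

    def call(self, name, *args):
        arity, fn = self.functions[name]
        if arity is not None and len(args) != arity:
            raise TypeError(
                f"The macro {name} accepts exactly {arity} arguments.")
        return fn(*args)
```

After `create_temp_macro("max", 1, ...)`, the built-in two-argument `max(3, 4)` call fails with an arity error, matching the SemanticException in the session.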




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77544 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77544/testReport)** for PR 13706 at commit [`4d8e843`](https://github.com/apache/spark/commit/4d8e843fb490845b8e5b55033ccac9bba93b7591).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68486 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68486/consoleFull)** for PR 13706 at commit [`9fe1881`](https://github.com/apache/spark/commit/9fe1881ffe2810ec445f0560a7920c167c22b2d7).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77472 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77472/testReport)** for PR 13706 at commit [`4ee32e9`](https://github.com/apache/spark/commit/4ee32e95e61015fe608884d3096ac82a781dd767).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118845133
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala ---
    @@ -1516,6 +1516,35 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
         )
       }
     
    +  test("create/drop temporary macro") {
    --- End diff --
    
    Should we also test a combination of temporary macros/functions...?



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77533 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77533/testReport)** for PR 13706 at commit [`1563f12`](https://github.com/apache/spark/commit/1563f12d78a9c32bf4bed69cb9f86a7d00eb18ef).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `case class MacroFunctionWrapper(columns: StructType, macroFunction: Expression)`



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60654 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60654/consoleFull)** for PR 13706 at commit [`0b93636`](https://github.com/apache/spark/commit/0b93636c941fd3093ba9b93e49e75211aa077c90).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @hvanhovell  Can you take a look at it? Thanks.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60780 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60780/consoleFull)** for PR 13706 at commit [`301e950`](https://github.com/apache/spark/commit/301e9508d860283ea8d12f91f52d04d9d697b000).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `class NoSuchTempMacroException(func: String)`
      * `case class CreateMacroCommand(`



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60642/
    Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60781 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60781/consoleFull)** for PR 13706 at commit [`808a5fa`](https://github.com/apache/spark/commit/808a5fa509392d8e2b909020f4a711c4dc2437b5).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/75958/
    Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60780/
    Test PASSed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77544/
    Test FAILed.



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77467 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77467/testReport)** for PR 13706 at commit [`b52698f`](https://github.com/apache/spark/commit/b52698ffdb5b5c809f749b7793f276bb9b305a0c).



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77468 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77468/testReport)** for PR 13706 at commit [`fce1121`](https://github.com/apache/spark/commit/fce112147449278f0078321306f1a1fe34ab938b).



[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.



[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119124106
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/NoSuchItemException.scala ---
    @@ -52,3 +52,6 @@ class NoSuchPartitionsException(db: String, table: String, specs: Seq[TableParti
     
     class NoSuchTempFunctionException(func: String)
       extends AnalysisException(s"Temporary function '$func' not found")
    +
    +class NoSuchTempMacroException(func: String)
    --- End diff --
    
    Yes, Thanks.



[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844274
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    --- End diff --
    
    Is this what Hive does? I really don't see why we should not support this.
    
    Please note that we cannot use generators if we decide that an expression has to be a fully resolved expression at creation time.
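    
    The substitution step under discussion in the diff above can be sketched with a small self-contained model. Note that `Expr`, `Attr`, `Lit`, `Add`, and `Macro` below are simplified stand-ins invented for illustration, not Spark's catalyst classes: a macro stores a body expression over named arguments, and expansion replaces each named attribute with the caller's expression, analogous to the `BoundReference` rewrite in `CreateMacroCommand`:
    
    ```scala
    // Simplified expression tree; Attr stands in for UnresolvedAttribute.
    sealed trait Expr
    case class Attr(name: String) extends Expr
    case class Lit(value: Int) extends Expr
    case class Add(l: Expr, r: Expr) extends Expr
    
    // A "macro" is its argument names plus a body expression over those names.
    case class Macro(args: Seq[String], body: Expr) {
      // Build the expanded expression for a concrete call by substituting
      // each named attribute with the caller's actual expression.
      def expand(actuals: Seq[Expr]): Expr = {
        require(actuals.length == args.length, s"macro expects ${args.length} args")
        val byName = args.zip(actuals).toMap
        def subst(e: Expr): Expr = e match {
          case Attr(n) =>
            byName.getOrElse(n, throw new IllegalArgumentException(s"unknown column $n"))
          case Add(l, r) => Add(subst(l), subst(r))
          case other => other
        }
        subst(body)
      }
    }
    
    def eval(e: Expr): Int = e match {
      case Lit(v) => v
      case Add(l, r) => eval(l) + eval(r)
      case Attr(n) => sys.error(s"unbound attribute $n")
    }
    
    // Conceptually: CREATE TEMPORARY MACRO plus_one(x INT) x + 1
    val plusOne = Macro(Seq("x"), Add(Attr("x"), Lit(1)))
    val expanded = plusOne.expand(Seq(Lit(41))) // Add(Lit(41), Lit(1))
    println(eval(expanded))                      // prints 42
    ```
    
    The PR itself instead binds each argument to a positional `BoundReference` and packages the rewrite as a `FunctionBuilder`, so expansion happens whenever the macro is invoked.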




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67596616
  
    --- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
    @@ -97,6 +97,9 @@ statement
         | CREATE TEMPORARY? FUNCTION qualifiedName AS className=STRING
             (USING resource (',' resource)*)?                              #createFunction
         | DROP TEMPORARY? FUNCTION (IF EXISTS)? qualifiedName              #dropFunction
    +    | CREATE TEMPORARY MACRO macroName=identifier
    --- End diff --
    
    No, Hive currently supports only temporary macros.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844209
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    --- End diff --
    
    Why do we need this?




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: CREATE/DROP ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60642 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60642/consoleFull)** for PR 13706 at commit [`f2433c2`](https://github.com/apache/spark/commit/f2433c2b6769bb7fcaf40b6e74a1eac3589aae82).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844406
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    --- End diff --
    
    It might be easier to use `StructType().toAttributes` here




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844341
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    --- End diff --
    
    Nit: put `}` on a new line




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Let me first clean up the existing function registry https://github.com/apache/spark/pull/18142. Will ping you when it is ready. Thanks!




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77466 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77466/testReport)** for PR 13706 at commit [`ad85109`](https://github.com/apache/spark/commit/ad851098de14a105846f9f060d0e3a3b26df266a).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77461 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77461/testReport)** for PR 13706 at commit [`314913d`](https://github.com/apache/spark/commit/314913df4a0345957c718622ab1924901a895b90).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77460 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77460/testReport)** for PR 13706 at commit [`277ba9f`](https://github.com/apache/spark/commit/277ba9fc64d94b9245574d5625109dbc3383bae9).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder `




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @gatorsmile Sorry for the late reply. I have now merged with master. Thanks.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @hvanhovell I have addressed your comments. Thanks.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119122622
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    --- End diff --
    
    Because the Analyzer would flag the macroFunction as invalid if I did not use MacroFunctionWrapper.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119124043
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
    @@ -1090,6 +1090,24 @@ class SessionCatalog(
         }
       }
     
    +  /** Create a temporary macro. */
    +  def createTempMacro(
    +      name: String,
    +      info: ExpressionInfo,
    +      functionBuilder: FunctionBuilder): Unit = {
    +    if (functionRegistry.functionExists(name)) {
    +      throw new AnalysisException(s"Function $name already exists")
    +    }
    +    functionRegistry.registerFunction(name, info, functionBuilder)
    +  }
    +
    +  /** Drop a temporary macro. */
    +  def dropTempMacro(name: String, ignoreIfNotExists: Boolean): Unit = {
    +    if (!functionRegistry.dropMacro(name) && !ignoreIfNotExists) {
    +      throw new NoSuchTempMacroException(name)
    --- End diff --
    
    Yes, I have updated it to handle this case.
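    
    For context, the intended semantics of the two catalog methods in this diff can be modeled with a short self-contained sketch. The `MacroRegistry` class below is hypothetical and invented for illustration only; it is not Spark's `SessionCatalog` or `FunctionRegistry`. The key behaviors: creation fails if the name is taken, dropping reports whether anything was removed, and the DDL-level drop raises only when the macro is missing and `IF EXISTS` was not specified:
    
    ```scala
    import scala.collection.mutable
    
    // Hypothetical, simplified model of the create/drop semantics above.
    class MacroRegistry {
      private val macros = mutable.Map.empty[String, String] // name -> macro body text
    
      def createTempMacro(name: String, body: String): Unit = {
        if (macros.contains(name))
          throw new IllegalStateException(s"Function $name already exists")
        macros(name) = body
      }
    
      // Returns true iff a macro was actually removed (mirrors dropMacro above).
      def dropMacro(name: String): Boolean = macros.remove(name).isDefined
    
      // DDL-level drop: fails only when the macro is missing AND IF EXISTS was absent.
      def dropTempMacro(name: String, ignoreIfNotExists: Boolean): Unit =
        if (!dropMacro(name) && !ignoreIfNotExists)
          throw new NoSuchElementException(s"Temporary macro '$name' not found")
    }
    
    val reg = new MacroRegistry
    reg.createTempMacro("plus_one", "x + 1")
    reg.dropTempMacro("plus_one", ignoreIfNotExists = false) // removes it
    reg.dropTempMacro("plus_one", ignoreIfNotExists = true)  // no-op, no error
    ```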




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60780 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60780/consoleFull)** for PR 13706 at commit [`301e950`](https://github.com/apache/spark/commit/301e9508d860283ea8d12f91f52d04d9d697b000).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60642 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60642/consoleFull)** for PR 13706 at commit [`f2433c2`](https://github.com/apache/spark/commit/f2433c2b6769bb7fcaf40b6e74a1eac3589aae82).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)`
      * `case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)`
      * `case class DropMacroCommand(macroName: String, ifExists: Boolean)`




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68505 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68505/consoleFull)** for PR 13706 at commit [`fb8b57a`](https://github.com/apache/spark/commit/fb8b57a4d46f6856dc2c883c6e995c248dda6a3b).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68487 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68487/consoleFull)** for PR 13706 at commit [`b8ffdc9`](https://github.com/apache/spark/commit/b8ffdc9d9f021e4fcec396a7bff5703a6e3ed521).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77460/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lshmouse <gi...@git.apache.org>.
Github user lshmouse commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang 
    I think the problem is that there is no need to check whether macroFunction is resolved.
    Data types may be cast dynamically according to the SQL data types.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @hvanhovell I have updated this PR. Can you take a look? Thanks.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77469 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77469/testReport)** for PR 13706 at commit [`eaff4e9`](https://github.com/apache/spark/commit/eaff4e966e07dd0df36ff85952462d68cb9474f9).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68486/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60841 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60841/consoleFull)** for PR 13706 at commit [`af0136d`](https://github.com/apache/spark/commit/af0136de2931aa390c4c83229622b53769952de3).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60841 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60841/consoleFull)** for PR 13706 at commit [`af0136d`](https://github.com/apache/spark/commit/af0136de2931aa390c4c83229622b53769952de3).
     * This patch passes all tests.
     * This patch **does not merge cleanly**.
     * This patch adds no public classes.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118845638
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
    @@ -1090,6 +1090,24 @@ class SessionCatalog(
         }
       }
     
    +  /** Create a temporary macro. */
    +  def createTempMacro(
    +      name: String,
    +      info: ExpressionInfo,
    +      functionBuilder: FunctionBuilder): Unit = {
    +    if (functionRegistry.functionExists(name)) {
    --- End diff --
    
    I am not entirely sure whether we should throw an exception here. Unfortunately, it depends on the semantics you follow: SQL will throw an exception, whereas the DataFrame API will simply overwrite the function. Let's follow Hive for now.
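    The Hive-style semantics discussed here can be sketched with a tiny standalone registry in plain Scala. This is illustrative only (the names are hypothetical, not Spark's `SessionCatalog` API): creating a macro under a name that is already taken raises an error instead of silently overwriting.

    ```scala
    import scala.collection.mutable

    // Minimal sketch of Hive-style "create" semantics for a function registry.
    object MacroRegistrySketch {
      private val builders = mutable.Map.empty[String, Seq[Int] => Int]

      // Registering under an existing name fails, mirroring SQL/Hive behavior.
      def createTempMacro(name: String, builder: Seq[Int] => Int): Unit = {
        if (builders.contains(name)) {
          throw new IllegalArgumentException(s"Function '$name' already exists")
        }
        builders(name) = builder
      }

      def lookup(name: String): Seq[Int] => Int = builders(name)
    }
    ```

    Under DataFrame-style semantics the `contains` check would simply be dropped and the entry overwritten.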




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77469 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77469/testReport)** for PR 13706 at commit [`eaff4e9`](https://github.com/apache/spark/commit/eaff4e966e07dd0df36ff85952462d68cb9474f9).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77461 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77461/testReport)** for PR 13706 at commit [`314913d`](https://github.com/apache/spark/commit/314913df4a0345957c718622ab1924901a895b90).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder(conf) `




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119123747
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala ---
    @@ -107,6 +110,14 @@ class SimpleFunctionRegistry extends FunctionRegistry {
         functionBuilders.remove(name).isDefined
       }
     
    +  override def dropMacro(name: String): Boolean = synchronized {
    --- End diff --
    
    Hive can drop a temporary function using the 'DROP MACRO' command, and it can also drop a temporary macro using the 'DROP TEMPORARY FUNCTION' command.
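    That interchangeability follows naturally if both DDL paths delegate to a single shared registry, as in this minimal sketch (plain Scala, hypothetical names — not Spark's `FunctionRegistry`):

    ```scala
    import scala.collection.mutable

    // One registry backs both commands, so either one removes the entry.
    object DropSketch {
      private val registry = mutable.Map.empty[String, String]

      def register(name: String, kind: String): Unit = registry(name) = kind

      // Returns true if an entry was actually removed.
      def dropMacro(name: String): Boolean = registry.remove(name).isDefined
      def dropTemporaryFunction(name: String): Boolean = registry.remove(name).isDefined
    }
    ```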




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77470/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77470 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77470/testReport)** for PR 13706 at commit [`97632a9`](https://github.com/apache/spark/commit/97632a9a3dab1322929c9011005fe1422e1cd748).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68485/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #75958 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/75958/consoleFull)** for PR 13706 at commit [`e895a9c`](https://github.com/apache/spark/commit/e895a9c7b89d2a53f6747f1e7fa08f8e97b80ed4).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460540
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    +    val colToIndex: Map[String, Int] = colNames.zipWithIndex.toMap
    +    macroFunction.body.transformUp {
    +      case u @ UnresolvedAttribute(nameParts) =>
    +        assert(nameParts.length == 1)
    +        colToIndex.get(nameParts.head).getOrElse(
    +          throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +            s"cannot resolve: [${u}] given input columns: [${inputSet}]"))
    +        u
    +      case _: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +          s"cannot support subquery for macro.")
    +    }
    +
    +    val macroInfo = macroFunction.arguments.mkString(",") + "->" + macroFunction.body.toString
    +    val info = new ExpressionInfo(macroInfo, macroName)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != colNames.size) {
    +        throw new AnalysisException(s"actual number of arguments: ${children.size} != " +
    +          s"expected number of arguments: ${colNames.size} for Macro $macroName")
    +      }
    +      macroFunction.body.transformUp {
    +        case u @ UnresolvedAttribute(nameParts) =>
    --- End diff --
    
    A `BoundReference` would allow you to directly get the index. See my previous comment.
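    The suggestion — bind each argument name to an ordinal once at definition time, so that expanding the macro later is a direct positional substitution — can be sketched with a toy expression tree (illustrative stand-ins for Catalyst's `UnresolvedAttribute` and `BoundReference`, not the real classes):

    ```scala
    sealed trait Expr
    case class NamedRef(name: String) extends Expr  // like UnresolvedAttribute
    case class BoundRef(ordinal: Int) extends Expr  // like BoundReference
    case class Add(left: Expr, right: Expr) extends Expr
    case class Lit(value: Int) extends Expr

    // Definition time: resolve every name to its column index, or fail.
    def bind(e: Expr, colToIndex: Map[String, Int]): Expr = e match {
      case NamedRef(n) =>
        BoundRef(colToIndex.getOrElse(n, sys.error(s"cannot resolve column: $n")))
      case Add(l, r) => Add(bind(l, colToIndex), bind(r, colToIndex))
      case other     => other
    }

    // Invocation time: substitute by index, no name lookup needed.
    def expand(e: Expr, args: Seq[Expr]): Expr = e match {
      case BoundRef(i) => args(i)
      case Add(l, r)   => Add(expand(l, args), expand(r, args))
      case other       => other
    }
    ```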




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77472/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77463 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77463/testReport)** for PR 13706 at commit [`3d05e4f`](https://github.com/apache/spark/commit/3d05e4f3509d32fa85618bfb475b648261a0694f).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844463
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    +        throw new AnalysisException(s"Cannot support Generator: ${u} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +    }
    +
    +    val macroInfo = columns.mkString(",") + " -> " + funcWrapper.macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName, true)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    --- End diff --
    
    It is slightly better to store `columns.size` in a separate variable, so we do not include `columns` in the closure.
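    The closure-capture point can be shown in a few lines of plain Scala (names are illustrative): hoisting the size into a local val means the builder captures only an `Int` rather than the whole `columns` sequence.

    ```scala
    case class Column(name: String, dataType: String)
    val columns = Seq(Column("a", "int"), Column("b", "int"))

    // Capture the arity, not the column sequence itself.
    val expectedArity = columns.size
    val builder: Seq[Int] => Int = { children =>
      require(children.size == expectedArity,
        s"actual number of arguments: ${children.size} != expected: $expectedArity")
      children.sum
    }
    ```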




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118846490
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    +        throw new AnalysisException(s"Cannot support Generator: ${u} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +    }
    +
    +    val macroInfo = columns.mkString(",") + " -> " + funcWrapper.macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName, true)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transform {
    +        // Skip to validate the input type because check it at runtime.
    --- End diff --
    
    How do we check at runtime? The current code does not seem to respect the declared types; it relies on the macro's body expression to do any type validation. This means you can pass anything to the macro, and the user can end up with an unexpected result:
    ```sql
    create macro plus(a int, b int) as a + b;
    select plus(1.0, 1.0) as result -- This returns a decimal, and not an int as expected
    ```
    So I think we should at least validate the input expressions. The hacky way would be to add casts and have the analyzer fail if the cast cannot be made (this is terrible UX). A better way would be to create some sentinel expression that makes sure the analyzer inserts the correct cast, and that throws a relevant exception (mentioning the macro) when this blows up...
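
    A minimal, self-contained sketch of the argument-type check suggested here, with no Spark dependency. The toy type model and the `canImplicitlyCast` rule are illustrative stand-ins for Catalyst's `DataType` hierarchy and `Cast.canCast`, not the actual API:

    ```scala
    // Toy type model standing in for Catalyst DataTypes; purely illustrative.
    sealed trait SqlType
    case object IntType extends SqlType
    case object DecimalType extends SqlType

    object MacroTypeCheck {
      // Allowed implicit widenings; a stand-in for Cast.canCast semantics.
      def canImplicitlyCast(from: SqlType, to: SqlType): Boolean =
        from == to || (from == IntType && to == DecimalType)

      // Fail with a macro-specific message when an argument cannot be cast
      // to the declared parameter type, as the review suggests.
      def validate(macroName: String, actual: Seq[SqlType], declared: Seq[SqlType]): Unit = {
        require(actual.size == declared.size,
          s"Macro $macroName expects ${declared.size} arguments, got ${actual.size}")
        actual.zip(declared).foreach { case (a, d) =>
          if (!canImplicitlyCast(a, d))
            throw new IllegalArgumentException(
              s"Macro $macroName: argument of type $a is not compatible with declared type $d")
        }
      }
    }
    ```

    With a check like this, `plus(1.0, 1.0)` against `plus(a int, b int)` would be rejected (or coerced) at analysis time instead of silently returning a decimal.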


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77466/
    Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460344
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    --- End diff --
    
    Not needed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68505 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68505/consoleFull)** for PR 13706 at commit [`fb8b57a`](https://github.com/apache/spark/commit/fb8b57a4d46f6856dc2c883c6e995c248dda6a3b).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @gatorsmile OK. Thanks.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77461/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77465 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77465/testReport)** for PR 13706 at commit [`1eb23c7`](https://github.com/apache/spark/commit/1eb23c75b0be7b93980b44f0a9fbaab6a489996e).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67634511
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    --- End diff --
    
    Why don't I move this into CreateMacroCommand? Because analyzer.checkAnalysis() checks whether the macroFunction of CreateMacroCommand is invalid. The macroFunction contains UnresolvedAttributes, so analyzer.checkAnalysis() would throw an unresolved exception. If the UnresolvedAttributes are resolved beforehand, analyzer.checkAnalysis() does not throw an exception.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67636641
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    --- End diff --
    
    Ah, I see. You could also move this code into the companion object of the `CreateMacroCommand`. That would also work. It is just that this code isn't parser-specific.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77464/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77468/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77465/
    Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460610
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    +    val colToIndex: Map[String, Int] = colNames.zipWithIndex.toMap
    +    macroFunction.body.transformUp {
    +      case u @ UnresolvedAttribute(nameParts) =>
    +        assert(nameParts.length == 1)
    +        colToIndex.get(nameParts.head).getOrElse(
    +          throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +            s"cannot resolve: [${u}] given input columns: [${inputSet}]"))
    --- End diff --
    
    `colNames`?




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460139
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +591,38 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.columns).toSeq.flatMap(visitCatalogColumns).map { col =>
    --- End diff --
    
    Call `visitColTypeList` and you will get a `Seq[StructField]`; that also makes it easier to construct the `AttributeReference` (you don't need to parse the data type).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118989451
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala ---
    @@ -1090,6 +1090,24 @@ class SessionCatalog(
         }
       }
     
    +  /** Create a temporary macro. */
    +  def createTempMacro(
    +      name: String,
    +      info: ExpressionInfo,
    +      functionBuilder: FunctionBuilder): Unit = {
    +    if (functionRegistry.functionExists(name)) {
    +      throw new AnalysisException(s"Function $name already exists")
    +    }
    +    functionRegistry.registerFunction(name, info, functionBuilder)
    +  }
    +
    +  /** Drop a temporary macro. */
    +  def dropTempMacro(name: String, ignoreIfNotExists: Boolean): Unit = {
    +    if (!functionRegistry.dropMacro(name) && !ignoreIfNotExists) {
    +      throw new NoSuchTempMacroException(name)
    --- End diff --
    
    ```
    hive>  DROP TEMPORARY MACRO max;
    OK
    Time taken: 0.01 seconds
    hive> select max(3) from t1;
    OK
    3
    ```
    
    After we drop the macro, the existing function still works. That means `DROP TEMPORARY MACRO` does not delete the original built-in function: after dropping a macro with the same name, `max` resolves to the original built-in function again.
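
    The lookup semantics described here can be modeled as a macro map layered over the built-in registry, so a macro shadows a same-named built-in and dropping it restores the built-in. This is an illustrative toy model, not the actual `FunctionRegistry` API:

    ```scala
    import scala.collection.mutable

    // A macro registered under a built-in's name shadows it; dropping the
    // macro restores the built-in, matching the Hive behavior shown above.
    class ToyFunctionRegistry {
      private val builtins = mutable.Map[String, String]("max" -> "builtin-max")
      private val macros   = mutable.Map[String, String]()

      def registerMacro(name: String, impl: String): Unit = macros(name) = impl
      // Returns true only if a macro with that name actually existed.
      def dropMacro(name: String): Boolean = macros.remove(name).isDefined
      // Macros take precedence over built-ins during lookup.
      def lookup(name: String): Option[String] = macros.get(name).orElse(builtins.get(name))
    }
    ```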





[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460021
  
    --- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
    @@ -97,6 +97,9 @@ statement
         | CREATE TEMPORARY? FUNCTION qualifiedName AS className=STRING
             (USING resource (',' resource)*)?                              #createFunction
         | DROP TEMPORARY? FUNCTION (IF EXISTS)? qualifiedName              #dropFunction
    +    | CREATE TEMPORARY MACRO macroName=identifier
    +        '('(columns=colTypeList)?')' expression                        #createMacro
    --- End diff --
    
    NIT: get rid of the parentheses; you can just use `colTypeList?`.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67633556
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != arguments.size) {
    +      throw operationNotAllowed(
    +        s"Cannot support duplicate colNames for CREATE TEMPORARY MACRO ", ctx)
    +    }
    +    val macroFunction = expression(ctx.expression).transformUp {
    --- End diff --
    
    Ditto




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Build finished. Test PASSed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118845109
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala ---
    @@ -1516,6 +1516,35 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
         )
       }
     
    +  test("create/drop temporary macro") {
    --- End diff --
    
    Can you also add a case for a macro without parameters? E.g.: `create temporary macro c() as 3E9`




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67636452
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,69 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    columns: Seq[AttributeReference],
    +    macroFunction: Expression)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val macroInfo = columns.mkString(",") + " -> " + macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transformUp {
    +        case b: BoundReference => children(b.ordinal)
    --- End diff --
    
    Ok that is perfect.
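
    The builder in the diff above substitutes the call-site argument expressions for the `BoundReference` placeholders stored in the macro body. A minimal self-contained model of that expansion, using a toy expression tree rather than Catalyst (names here are illustrative):

    ```scala
    // Toy expression tree; Placeholder plays the role of BoundReference.
    sealed trait Expr
    case class Lit(v: Int) extends Expr
    case class Placeholder(ordinal: Int) extends Expr
    case class Add(left: Expr, right: Expr) extends Expr

    // Expand a macro body by replacing each placeholder with the
    // corresponding call-site argument, recursing through the tree.
    def expand(body: Expr, args: Seq[Expr]): Expr = body match {
      case Placeholder(i) => args(i)
      case Add(l, r)      => Add(expand(l, args), expand(r, args))
      case other          => other
    }

    // CREATE TEMPORARY MACRO plus(a int, b int) a + b
    // stores the body with positional placeholders:
    val plusBody = Add(Placeholder(0), Placeholder(1))
    // plus(1, 2) then expands before analysis continues:
    val expanded = expand(plusBody, Seq(Lit(1), Lit(2)))
    ```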




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77470 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77470/testReport)** for PR 13706 at commit [`97632a9`](https://github.com/apache/spark/commit/97632a9a3dab1322929c9011005fe1422e1cd748).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67635961
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    --- End diff --
    
    @hvanhovell So I think I will create a new wrapper class to avoid the unresolved exception, so that DataFrame can reuse this feature later.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119122834
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    --- End diff --
    
    Yes, I will do it, thanks.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460632
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    --- End diff --
    
    not needed




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67634631
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,69 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    columns: Seq[AttributeReference],
    +    macroFunction: Expression)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val macroInfo = columns.mkString(",") + " -> " + macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transformUp {
    +        case b: BoundReference => children(b.ordinal)
    --- End diff --
    
    We do not validate the input type here. This would be entirely fine if macro arguments were defined without a `DataType`. I am not sure what we need to do here though. We have two options:
    - Ignore the DataType and rely on the expressions `inputTypes` to get casting done. This must be documented though. 
    - Introduce casts to make sure the input conforms to the required input.
    
    What do you think?
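
    The second option (introducing casts) can be sketched as follows. The toy types below only mirror the shape of Catalyst's `Cast` expression; this is an illustrative sketch, not the PR's code:

    ```scala
    // Hypothetical sketch: wrap each call-site argument in a cast to the
    // declared macro parameter type before substitution, so the macro body
    // sees exactly the types it was defined with.
    sealed trait DataType
    case object IntType extends DataType
    case object LongType extends DataType

    sealed trait Expr { def dataType: DataType }
    case class Lit(value: Long, dataType: DataType) extends Expr
    case class Cast(child: Expr, dataType: DataType) extends Expr

    def coerce(args: Seq[Expr], declared: Seq[DataType]): Seq[Expr] =
      args.zip(declared).map { case (a, t) =>
        if (a.dataType == t) a else Cast(a, t) // insert a cast only when needed
      }
    ```

    With this, an `IntType` argument passed to a `bigint` macro parameter would arrive as `Cast(arg, LongType)` rather than relying on each expression's `inputTypes` for coercion.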




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60781/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77469/
    Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844357
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    --- End diff --
    
    We should respect the case-sensitivity settings here. So a lookup might not be the best idea.
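
    A case-sensitivity-aware resolution could look like the sketch below, using a resolver function instead of a plain `Map` lookup (names here are hypothetical; Spark's analyzer exposes a similar `Resolver` driven by `spark.sql.caseSensitive`):

    ```scala
    // Sketch: resolve a referenced argument name against the declared macro
    // columns, honoring a case-sensitivity flag and rejecting ambiguity.
    case class Column(name: String)

    def resolveArg(name: String, columns: Seq[Column], caseSensitive: Boolean): Option[Int] = {
      val matches: (String, String) => Boolean =
        if (caseSensitive) _ == _ else _.equalsIgnoreCase(_)
      columns.zipWithIndex.filter { case (c, _) => matches(c.name, name) } match {
        case Seq((_, idx)) => Some(idx) // exactly one match: resolved ordinal
        case _             => None      // missing or ambiguous reference
      }
    }
    ```

    Under case-insensitive settings, `TIME_MS` would resolve against a declared `time_ms` column; a `Map[String, Int]` keyed on the literal name would miss it.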




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60840/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67633550
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    --- End diff --
    
    Move this into `CreateMacroCommand`. This would also be relevant if we were to offer a different API for creating macros.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68505/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lshmouse <gi...@git.apache.org>.
Github user lshmouse commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang 
    
    Just some feedback. With this patch, creating a macro throws the following exception.
    Any suggestions? I am trying to debug it.
    
    ```
    16/11/30 16:59:18 INFO execution.SparkSqlParser: Parsing command: CREATE TEMPORARY MACRO flr(time_ms bigint) FLOOR(time_ms/1000/3600)*3600
    16/11/30 16:59:18 ERROR thriftserver.SparkExecuteStatementOperation: Error executing query, currentState RUNNING, 
    org.apache.spark.sql.AnalysisException: Cannot resolve '(FLOOR(((boundreference() / 1000) / 3600)) * 3600)' for CREATE TEMPORARY MACRO flr, due to data type mismatch: differing types in '(FLOOR(((boundreference() / 1000) / 3600)) * 3600)' (bigint and int).;
      at org.apache.spark.sql.execution.command.CreateMacroCommand.run(macros.scala:70)                  
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)  
      at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)         
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:120)          
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:120)          
      at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:141)        
      at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)                  
      at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:138)                      
      at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:119)                           
      at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)         
      at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)                    
      at org.apache.spark.sql.Dataset.<init>(Dataset.scala:186)                                          
      at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)                                          
      at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:65)                                          
      at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)                                   
      at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:682)                                       
      at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:221)
      at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:165)
      at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1$$anon$2.run(SparkExecuteStatementOperation.scala:162)
      at java.security.AccessController.doPrivileged(Native Method)                                      
      at javax.security.auth.Subject.doAs(Subject.java:415)                                              
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1854)            
      at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$1.run(SparkExecuteStatementOperation.scala:175)
      at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)                         
      at java.util.concurrent.FutureTask.run(FutureTask.java:262)                                        
      at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)                 
      at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)                 
      at java.lang.Thread.run(Thread.java:745)
    ```




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77464 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77464/testReport)** for PR 13706 at commit [`22d8b1a`](https://github.com/apache/spark/commit/22d8b1acbba87149add028c7f9f053f40ee55bb0).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang Could you restart the work? Thanks!




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77464 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77464/testReport)** for PR 13706 at commit [`22d8b1a`](https://github.com/apache/spark/commit/22d8b1acbba87149add028c7f9f053f40ee55bb0).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118846647
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    +        throw new AnalysisException(s"Cannot support Generator: ${u} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +    }
    +
    +    val macroInfo = columns.mkString(",") + " -> " + funcWrapper.macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName, true)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transform {
    +        // Skip validating the input types here because they are checked at runtime.
    --- End diff --
    
    On a related note, we are currently not sure if the macro produces a valid expression. Maybe we should run analysis on the macro expression to make sure it does not fail every query later on, e.g.:
    ```scala
    
    val resolvedMacroFunction = try {
      val plan = Project(Alias(macroFunction, "m")() :: Nil, OneRowRelation)
      val analyzed @ Project(Seq(named), OneRowRelation) =
        sparkSession.sessionState.analyzer.execute(plan)
      sparkSession.sessionState.analyzer.checkAnalysis(analyzed)
      named.children.head
    } catch {
      case a: AnalysisException =>
        ...
    }
    ```
    Note that we cannot use generators if we use this approach...
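The function-builder idea discussed above, where a macro body's argument slots are substituted with the call-site expressions, can be illustrated with a self-contained sketch. This is a hypothetical miniature, not Spark's actual API: `Expr`, `BoundRef`, and `MacroSketch` are made-up names standing in for `Expression`, `BoundReference`, and the builder that `CreateMacroCommand` registers.
```scala
// Hypothetical sketch of macro expansion via expression substitution.
// A macro body is an expression tree whose argument slots (BoundRef) are
// replaced by the actual arguments when the macro is invoked.
sealed trait Expr
case class Literal(value: Int) extends Expr
case class BoundRef(ordinal: Int) extends Expr
case class Add(left: Expr, right: Expr) extends Expr

object MacroSketch {
  // The "builder" closes over the macro body and substitutes each BoundRef
  // with the corresponding actual argument, like the FunctionBuilder in the PR.
  def builder(arity: Int, body: Expr): Seq[Expr] => Expr = { children =>
    require(children.size == arity,
      s"Actual number of arguments: ${children.size} != expected: $arity")
    def substitute(e: Expr): Expr = e match {
      case BoundRef(i)  => children(i)
      case Add(l, r)    => Add(substitute(l), substitute(r))
      case lit: Literal => lit
    }
    substitute(body)
  }

  def eval(e: Expr): Int = e match {
    case Literal(v)  => v
    case Add(l, r)   => eval(l) + eval(r)
    case BoundRef(i) => sys.error(s"unbound argument $i")
  }

  def main(args: Array[String]): Unit = {
    // CREATE TEMPORARY MACRO plus1(x INT) x + 1, then call plus1(41).
    val plus1 = builder(1, Add(BoundRef(0), Literal(1)))
    println(eval(plus1(Seq(Literal(41)))))  // prints 42
  }
}
```
The PR's real builder does the same substitution with `macroFunction.transformUp { case b: BoundReference => children(b.ordinal) }`.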


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77527 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77527/testReport)** for PR 13706 at commit [`b539e94`](https://github.com/apache/spark/commit/b539e94eae58847c9da13a3cb94932b17ea2fc6e).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `class SystemFunctionRegistry(builtin: SimpleFunctionRegistry) extends SimpleFunctionRegistry `




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77467/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68487 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68487/consoleFull)** for PR 13706 at commit [`b8ffdc9`](https://github.com/apache/spark/commit/b8ffdc9d9f021e4fcec396a7bff5703a6e3ed521).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77460 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77460/testReport)** for PR 13706 at commit [`277ba9f`](https://github.com/apache/spark/commit/277ba9fc64d94b9245574d5625109dbc3383bae9).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68506 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68506/consoleFull)** for PR 13706 at commit [`e895a9c`](https://github.com/apache/spark/commit/e895a9c7b89d2a53f6747f1e7fa08f8e97b80ed4).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang Are you still working on this PR? Could you please address the conflicts? 




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @hvanhovell Yes, I will update later. Thanks.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77466 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77466/testReport)** for PR 13706 at commit [`ad85109`](https://github.com/apache/spark/commit/ad851098de14a105846f9f060d0e3a3b26df266a).
     * This patch **fails to build**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67635827
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -590,6 +592,53 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    +      .getOrElse(Seq.empty[StructField]).map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = arguments.map(_.name).zipWithIndex.toMap
    --- End diff --
    
    @hvanhovell So I think I will create a new wrapper class to avoid the unresolved exception, so that DataFrame can reuse this feature later.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77527/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68485 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68485/consoleFull)** for PR 13706 at commit [`5550496`](https://github.com/apache/spark/commit/5550496617230e46b0e3139c85ba01eed5184114).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460480
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    +    val colToIndex: Map[String, Int] = colNames.zipWithIndex.toMap
    +    macroFunction.body.transformUp {
    +      case u @ UnresolvedAttribute(nameParts) =>
    +        assert(nameParts.length == 1)
    +        colToIndex.get(nameParts.head).getOrElse(
    +          throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +            s"cannot resolve: [${u}] given input columns: [${inputSet}]"))
    +        u
    +      case _: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +          s"cannot support subquery for macro.")
    +    }
    +
    +    val macroInfo = macroFunction.arguments.mkString(",") + "->" + macroFunction.body.toString
    +    val info = new ExpressionInfo(macroInfo, macroName)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != colNames.size) {
    +        throw new AnalysisException(s"actual number of arguments: ${children.size} != " +
    +          s"expected number of arguments: ${colNames.size} for Macro $macroName")
    +      }
    +      macroFunction.body.transformUp {
    +        case u @ UnresolvedAttribute(nameParts) =>
    +          assert(nameParts.length == 1)
    +          colToIndex.get(nameParts.head).map(children(_)).getOrElse(
    +            throw new AnalysisException(s"Macro '$macroInfo' cannot resolve '$u' " +
    +              s"given input expressions: [${children.mkString(",")}]"))
    +      }
    +    }
    +    catalog.createTempFunction(macroName, info, builder, ignoreIfExists = false)
    +    Seq.empty[Row]
    +  }
    +}
    +
    +/**
    + * The DDL command that drops a macro.
    + * ifExists: returns an error if the macro doesn't exist, unless this is true.
    + * {{{
    + *    DROP TEMPORARY MACRO [IF EXISTS] macro_name;
    + * }}}
    + */
    +case class DropMacroCommand(macroName: String, ifExists: Boolean)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    --- End diff --
    
    This will drop any function... Can we make it Macro specific?
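    
    One way to make the drop macro-specific is sketched below under assumed names: a toy mutable map stands in for the `SessionCatalog` registry, and only the `MacroFunctionBuilder` marker class comes from the discussion. Macros register a builder of that marker subtype, so `DROP TEMPORARY MACRO` can pattern match on it and refuse to drop plain functions.
    ```scala
    import scala.util.Try

    object DropMacroSketch {
      case class ExprStub(name: String)
      type FunctionBuilder = Seq[ExprStub] => ExprStub

      // Marker subtype: a macro's builder is distinguishable by its runtime type.
      class MacroFunctionBuilder(f: FunctionBuilder) extends FunctionBuilder {
        def apply(children: Seq[ExprStub]): ExprStub = f(children)
      }

      private val registry = scala.collection.mutable.Map.empty[String, FunctionBuilder]

      def createMacro(name: String, f: FunctionBuilder): Unit =
        registry(name) = new MacroFunctionBuilder(f)

      def createFunction(name: String, f: FunctionBuilder): Unit =
        registry(name) = f

      // Drops only macros; errors out on plain functions or missing names.
      def dropMacro(name: String, ifExists: Boolean): Boolean =
        registry.get(name) match {
          case Some(_: MacroFunctionBuilder) => registry.remove(name); true
          case Some(_)                       => sys.error(s"$name is a function, not a macro")
          case None if ifExists              => false
          case None                          => sys.error(s"macro $name does not exist")
        }

      def main(args: Array[String]): Unit = {
        createMacro("m", cs => cs.head)
        createFunction("f", cs => cs.head)
        println(dropMacro("m", ifExists = false))                      // prints true
        println(Try(dropMacro("f", ifExists = false)).isFailure)       // prints true
      }
    }
    ```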




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844485
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala ---
    @@ -716,6 +716,37 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder(conf) {
       }
     
       /**
    +   * Create a [[CreateMacroCommand]] command.
    +   *
    +   * For example:
    +   * {{{
    +   *   CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    +   * }}}
    +   */
    +  override def visitCreateMacro(ctx: CreateMacroContext): LogicalPlan = withOrigin(ctx) {
    +    val arguments = Option(ctx.colTypeList).map(visitColTypeList(_))
    --- End diff --
    
    Nit: you can avoid `(_)`...




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77472 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77472/testReport)** for PR 13706 at commit [`4ee32e9`](https://github.com/apache/spark/commit/4ee32e95e61015fe608884d3096ac82a781dd767).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68506 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68506/consoleFull)** for PR 13706 at commit [`e895a9c`](https://github.com/apache/spark/commit/e895a9c7b89d2a53f6747f1e7fa08f8e97b80ed4).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #75958 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/75958/consoleFull)** for PR 13706 at commit [`e895a9c`](https://github.com/apache/spark/commit/e895a9c7b89d2a53f6747f1e7fa08f8e97b80ed4).
     * This patch **fails Spark unit tests**.
     * This patch **does not merge cleanly**.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang thanks for updating the PR. Could you implement the macro removal by pattern matching on a (to be created) `MacroFunctionBuilder` class? I feel this is simpler, and doesn't touch as many of the APIs.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60654 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60654/consoleFull)** for PR 13706 at commit [`0b93636`](https://github.com/apache/spark/commit/0b93636c941fd3093ba9b93e49e75211aa077c90).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60781 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60781/consoleFull)** for PR 13706 at commit [`808a5fa`](https://github.com/apache/spark/commit/808a5fa509392d8e2b909020f4a711c4dc2437b5).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Build finished. Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67635570
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,69 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    columns: Seq[AttributeReference],
    +    macroFunction: Expression)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val macroInfo = columns.mkString(",") + " -> " + macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transformUp {
    +        case b: BoundReference => children(b.ordinal)
    --- End diff --
    
    @hvanhovell good points. Because the Analyzer checks each expression's checkInputDataTypes after ResolveFunctions, I think we do not need to validate input types here. I do not think adding casts would have benefits; it may instead introduce unnecessary casts. I will add some comments for it. Thanks.
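The substitution discussed above can be sketched as a standalone toy model. The types below are simplified stand-ins for Catalyst's `Expression`/`BoundReference`, not the real Spark API; the point is only to show the `transformUp`-style rewrite the macro builder performs, replacing each bound reference with the call-site argument at its ordinal while leaving type checking to the later analysis phase.

```scala
// Minimal stand-ins for Catalyst's expression tree (assumed names, not Spark's).
sealed trait Expr
case class BoundRef(ordinal: Int) extends Expr
case class Literal(value: Int) extends Expr
case class Add(left: Expr, right: Expr) extends Expr

// Bottom-up substitution, analogous to macroFunction.transformUp in the diff:
// every BoundRef is replaced by the call-site argument at its ordinal.
def expandMacro(body: Expr, args: Seq[Expr]): Expr = body match {
  case BoundRef(i) =>
    require(i < args.size, s"macro called with ${args.size} args, needs ordinal $i")
    args(i)
  case Add(l, r) => Add(expandMacro(l, args), expandMacro(r, args))
  case other     => other
}

// CREATE TEMPORARY MACRO plus(a INT, b INT) a + b
// stores body = Add(BoundRef(0), BoundRef(1)); a call plus(1, 2) expands to:
val expanded = expandMacro(Add(BoundRef(0), BoundRef(1)), Seq(Literal(1), Literal(2)))
// expanded == Add(Literal(1), Literal(2))
```

No cast is inserted during expansion; the substituted expression is handed back to the Analyzer, whose normal `checkInputDataTypes` pass validates it, which matches the reasoning in the comment above.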




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60841/
    Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    cc @gatorsmile could you take a look at the way this interacts with the session catalog and the function registry?




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68486 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68486/consoleFull)** for PR 13706 at commit [`9fe1881`](https://github.com/apache/spark/commit/9fe1881ffe2810ec445f0560a7920c167c22b2d7).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77544 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77544/testReport)** for PR 13706 at commit [`4d8e843`](https://github.com/apache/spark/commit/4d8e843fb490845b8e5b55033ccac9bba93b7591).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119122863
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    --- End diff --
    
    Yes, I will do it, thanks.
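The duplicate-column check from the diff above relies on a property of `zipWithIndex.toMap`: when the same name appears twice, `toMap` keeps only the last entry, so a size mismatch reveals the duplicate. A standalone sketch of that check (plain strings here instead of `AttributeReference`s):

```scala
// Simplified version of the colToIndex construction in CreateMacroCommand.
// toMap keeps only the last index per key, so duplicates shrink the map.
def colToIndex(columns: Seq[String]): Map[String, Int] = {
  val m = columns.zipWithIndex.toMap
  require(m.size == columns.size,
    s"Cannot support duplicate colNames, actual columns: ${columns.mkString(",")}")
  m
}

// colToIndex(Seq("a", "b")) succeeds; colToIndex(Seq("a", "a")) throws.
```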




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by HyukjinKwon <gi...@git.apache.org>.
Github user HyukjinKwon commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @gatorsmile, I just wonder if this PR is ready to proceed further. It looks like the PR you linked has been merged properly.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77465 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77465/testReport)** for PR 13706 at commit [`1eb23c7`](https://github.com/apache/spark/commit/1eb23c75b0be7b93980b44f0a9fbaab6a489996e).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118987906
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/NoSuchItemException.scala ---
    @@ -52,3 +52,6 @@ class NoSuchPartitionsException(db: String, table: String, specs: Seq[TableParti
     
     class NoSuchTempFunctionException(func: String)
       extends AnalysisException(s"Temporary function '$func' not found")
    +
    +class NoSuchTempMacroException(func: String)
    --- End diff --
    
    Please remove it. For reasons, please see the PR https://github.com/apache/spark/pull/17716. 




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @hvanhovell I have addressed your comments. Thanks.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77463/
    Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77468 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77468/testReport)** for PR 13706 at commit [`fce1121`](https://github.com/apache/spark/commit/fce112147449278f0078321306f1a1fe34ab938b).
     * This patch **fails Spark unit tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/68506/
    Test PASSed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460169
  
    --- Diff: sql/catalyst/src/main/antlr4/org/apache/spark/sql/catalyst/parser/SqlBase.g4 ---
    @@ -97,6 +97,9 @@ statement
         | CREATE TEMPORARY? FUNCTION qualifiedName AS className=STRING
             (USING resource (',' resource)*)?                              #createFunction
         | DROP TEMPORARY? FUNCTION (IF EXISTS)? qualifiedName              #dropFunction
    +    | CREATE TEMPORARY MACRO macroName=identifier
    --- End diff --
    
    Does Hive also support non-temporary macros?




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/60654/
    Test PASSed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844322
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/command/DDLSuite.scala ---
    @@ -1516,6 +1516,35 @@ abstract class DDLSuite extends QueryTest with SQLTestUtils {
         )
       }
     
    +  test("create/drop temporary macro") {
    --- End diff --
    
    Can you use `SQLQueryTestSuite` instead?




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844675
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala ---
    @@ -107,6 +110,14 @@ class SimpleFunctionRegistry extends FunctionRegistry {
         functionBuilders.remove(name).isDefined
       }
     
    +  override def dropMacro(name: String): Boolean = synchronized {
    --- End diff --
    
    A drop function can currently also drop a macro. Can you make sure that this cannot happen?
    
    Maybe we should consolidate this into a single drop function with a `macro` flag. cc @gatorsmile WDYT?




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #68485 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/68485/consoleFull)** for PR 13706 at commit [`5550496`](https://github.com/apache/spark/commit/5550496617230e46b0e3139c85ba01eed5184114).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77463 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77463/testReport)** for PR 13706 at commit [`3d05e4f`](https://github.com/apache/spark/commit/3d05e4f3509d32fa85618bfb475b648261a0694f).
     * This patch **fails Scala style tests**.
     * This patch merges cleanly.
     * This patch adds the following public classes _(experimental)_:
      * `case class AnalysisContext(`




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Build finished. Test FAILed.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by lianhuiwang <gi...@git.apache.org>.
Github user lianhuiwang commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r119124256
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    +        throw new AnalysisException(s"Cannot support Generator: ${u} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +    }
    +
    +    val macroInfo = columns.mkString(",") + " -> " + funcWrapper.macroFunction.toString
    +    val info = new ExpressionInfo(macroInfo, macroName, true)
    +    val builder = (children: Seq[Expression]) => {
    +      if (children.size != columns.size) {
    +        throw new AnalysisException(s"Actual number of columns: ${children.size} != " +
    +          s"expected number of columns: ${columns.size} for Macro $macroName")
    +      }
    +      macroFunction.transform {
    +        // Skip to validate the input type because check it at runtime.
    --- End diff --
    
    Yes, I have now updated it with your ideas. Thanks.
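The create-time validation in the diff above rejects subqueries and generators when the macro is defined, so the stored body contains only expressions that can be expanded inline. A standalone sketch of that rejection (the types are simplified stand-ins, not Catalyst's `SubqueryExpression`/`UnresolvedGenerator`):

```scala
// Toy expression tree standing in for the Catalyst classes checked in run().
sealed trait MacroExpr
case class ColumnRef(name: String) extends MacroExpr
case class Subquery(sql: String) extends MacroExpr
case class Call(fn: String, args: Seq[MacroExpr]) extends MacroExpr

// Walk the body once at CREATE TEMPORARY MACRO time and fail fast on
// constructs the macro mechanism cannot expand inline.
def validateMacroBody(e: MacroExpr): Unit = e match {
  case Subquery(_)   => throw new IllegalArgumentException("Cannot support Subquery in a macro body")
  case Call(_, args) => args.foreach(validateMacroBody)
  case _             => ()
}
```

Failing at definition time rather than at call time matches the diff's behavior: a bad body is reported once, with the macro name, instead of on every invocation.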




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77533 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77533/testReport)** for PR 13706 at commit [`1563f12`](https://github.com/apache/spark/commit/1563f12d78a9c32bf4bed69cb9f86a7d00eb18ef).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #77527 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77527/testReport)** for PR 13706 at commit [`b539e94`](https://github.com/apache/spark/commit/b539e94eae58847c9da13a3cb94932b17ea2fc6e).




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460294
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    +    val colToIndex: Map[String, Int] = colNames.zipWithIndex.toMap
    +    macroFunction.body.transformUp {
    +      case u @ UnresolvedAttribute(nameParts) =>
    +        assert(nameParts.length == 1)
    +        colToIndex.get(nameParts.head).getOrElse(
    +          throw new AnalysisException(s"Cannot create temporary macro '$macroName', " +
    +            s"cannot resolve: [${u}] given input columns: [${inputSet}]"))
    +        u
    --- End diff --
    
    Why not replace this by a `BoundReference`? Then we don't need to do a lookup in the map, every time the macro gets used.
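    A minimal sketch of that suggestion, assuming Catalyst's `BoundReference` and the argument list from `MacroFunctionWrapper` (the helper name `bindMacroBody` is illustrative, not from the PR):
    
    ```scala
    import org.apache.spark.sql.catalyst.analysis.UnresolvedAttribute
    import org.apache.spark.sql.catalyst.expressions.{BoundReference, Expression}
    import org.apache.spark.sql.types.DataType
    
    // Resolve each macro argument to its ordinal position once, at CREATE time,
    // so later invocations bind by index instead of repeating the map lookup.
    def bindMacroBody(body: Expression, args: Seq[(String, DataType)]): Expression = {
      val colToIndex = args.map(_._1).zipWithIndex.toMap
      body.transformUp {
        case u @ UnresolvedAttribute(nameParts) =>
          colToIndex.get(nameParts.mkString(".")) match {
            case Some(i) => BoundReference(i, args(i)._2, nullable = true)
            case None    => u // leave unresolved; the analyzer will report it
          }
      }
    }
    ```
    
    With this, expanding the macro only requires substituting the call's child expressions at the bound ordinals.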




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60840 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60840/consoleFull)** for PR 13706 at commit [`f4ed3bc`](https://github.com/apache/spark/commit/f4ed3bc13cbc629d055ef74a127cf212217cd589).




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test FAILed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Merged build finished. Test PASSed.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    @lianhuiwang could you bring this up to date? I would love to get this in.




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r118844444
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,99 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis._
    +import org.apache.spark.sql.catalyst.expressions._
    +import org.apache.spark.sql.types.StructField
    +
    +/**
    + * This class provides arguments and body expression of the macro function.
    + */
    +case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(
    +    macroName: String,
    +    funcWrapper: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val columns = funcWrapper.columns.map { col =>
    +      AttributeReference(col.name, col.dataType, col.nullable, col.metadata)() }
    +    val colToIndex: Map[String, Int] = columns.map(_.name).zipWithIndex.toMap
    +    if (colToIndex.size != columns.size) {
    +      throw new AnalysisException(s"Cannot support duplicate colNames " +
    +        s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}")
    +    }
    +    val macroFunction = funcWrapper.macroFunction.transform {
    +      case u: UnresolvedAttribute =>
    +        val index = colToIndex.get(u.name).getOrElse(
    +          throw new AnalysisException(s"Cannot find colName: ${u} " +
    +            s"for CREATE TEMPORARY MACRO $macroName, actual columns: ${columns.mkString(",")}"))
    +        BoundReference(index, columns(index).dataType, columns(index).nullable)
    +      case u: UnresolvedFunction =>
    +        sparkSession.sessionState.catalog.lookupFunction(u.name, u.children)
    +      case s: SubqueryExpression =>
    +        throw new AnalysisException(s"Cannot support Subquery: ${s} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +      case u: UnresolvedGenerator =>
    +        throw new AnalysisException(s"Cannot support Generator: ${u} " +
    +          s"for CREATE TEMPORARY MACRO $macroName")
    +    }
    +
    +    val macroInfo = columns.mkString(",") + " -> " + funcWrapper.macroFunction.toString
    --- End diff --
    
    Can you give an example of what this would look like?
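    For context, a hypothetical rendering of that `macroInfo` string (the exact output depends on Catalyst's `toString` formatting; the macro and values below are illustrative, not from the PR):
    
    ```scala
    // Hypothetical: what macroInfo might look like for
    //   CREATE TEMPORARY MACRO sigmoid(x DOUBLE) 1.0 / (1.0 + EXP(-x))
    val columns = Seq("x#0: double")          // AttributeReference.toString-style
    val body    = "(1.0 / (1.0 + EXP(-'x)))"  // unresolved expression toString
    val macroInfo = columns.mkString(",") + " -> " + body
    // roughly: "x#0: double -> (1.0 / (1.0 + EXP(-'x)))"
    ```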




[GitHub] spark pull request #13706: [SPARK-15988] [SQL] Implement DDL commands: Creat...

Posted by hvanhovell <gi...@git.apache.org>.
Github user hvanhovell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13706#discussion_r67460323
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/command/macros.scala ---
    @@ -0,0 +1,94 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.sql.execution.command
    +
    +import org.apache.spark.sql.{AnalysisException, Row, SparkSession}
    +import org.apache.spark.sql.catalyst.analysis.{FunctionRegistry, UnresolvedAttribute}
    +import org.apache.spark.sql.catalyst.expressions._
    +
    +/**
    + * This class provides arguments and body expression of the macro.
    + */
    +case class MacroFunctionWrapper(arguments: Seq[AttributeReference], body: Expression)
    +
    +/**
    + * The DDL command that creates a macro.
    + * To create a temporary macro, the syntax of using this command in SQL is:
    + * {{{
    + *    CREATE TEMPORARY MACRO macro_name([col_name col_type, ...]) expression;
    + * }}}
    + */
    +case class CreateMacroCommand(macroName: String, macroFunction: MacroFunctionWrapper)
    +  extends RunnableCommand {
    +
    +  override def run(sparkSession: SparkSession): Seq[Row] = {
    +    val catalog = sparkSession.sessionState.catalog
    +    val inputSet = AttributeSet(macroFunction.arguments)
    +    val colNames = macroFunction.arguments.map(_.name)
    +    val colToIndex: Map[String, Int] = colNames.zipWithIndex.toMap
    +    macroFunction.body.transformUp {
    +      case u @ UnresolvedAttribute(nameParts) =>
    +        assert(nameParts.length == 1)
    --- End diff --
    
    Why? The only thing that matters is that the name is in the parameter list.




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by gatorsmile <gi...@git.apache.org>.
Github user gatorsmile commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    Sure. Will do it. Thanks!




[GitHub] spark issue #13706: [SPARK-15988] [SQL] Implement DDL commands: Create/Drop ...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the issue:

    https://github.com/apache/spark/pull/13706
  
    **[Test build #60840 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/60840/consoleFull)** for PR 13706 at commit [`f4ed3bc`](https://github.com/apache/spark/commit/f4ed3bc13cbc629d055ef74a127cf212217cd589).
     * This patch **fails Spark unit tests**.
     * This patch **does not merge cleanly**.
     * This patch adds the following public classes _(experimental)_:
      * `case class MacroFunctionWrapper(columns: Seq[StructField], macroFunction: Expression)`

