Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/12/02 14:27:10 UTC

[GitHub] [spark] KevinAppelBofa commented on pull request #34693: [SPARK-37259][SQL] Support CTE and TempTable queries with MSSQL JDBC

KevinAppelBofa commented on pull request #34693:
URL: https://github.com/apache/spark/pull/34693#issuecomment-984676447


   @attilapiros Finding where the WITH piece of the query stops and where the final SELECT begins is where I found the place to split. In the test query
   ```python
   query2 = """
   WITH DummyCTE AS
   (
   SELECT 1 as DummyCOL
   )
   SELECT *
   FROM DummyCTE
   """
   ```
   
   This splits into
   ```python
   withClause = """
   WITH DummyCTE AS
   (
   SELECT 1 as DummyCOL
   )
   """
   query = """
   SELECT *
   FROM DummyCTE
   """
   ```
   
   The actual query we are running is more complex, with a bunch of CTEs chained together in the WITH clause. There I took the same approach: everything up to where the WITH part ends goes into the withClause, and the rest goes into the query.
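   
   For illustration, a minimal sketch of that chained-CTE case; the CTE names and columns here are invented for the example, not taken from the actual query:
   
   ```python
   # Hypothetical chained CTEs: everything up to the final SELECT becomes the
   # withClause, the trailing SELECT becomes the query.
   withClause = """
   WITH FirstCTE AS
   (
   SELECT 1 as DummyCOL
   ),
   SecondCTE AS
   (
   SELECT DummyCOL + 1 as DummyCOL2 FROM FirstCTE
   )
   """
   query = """
   SELECT *
   FROM SecondCTE
   """
   ```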
   
   This same technique works for the temp table query: the part that generates the temp table goes into the withClause and the rest goes into the query.
   
   ```python
   query3 = """
   (SELECT *
   INTO #Temp1a
   FROM
   (SELECT @@VERSION as version) data
   )
   
   (SELECT *
   FROM
   #Temp1a)
   """
   ```
   
   This turns into
   ```python
   withClause = """
   (SELECT *
   INTO #Temp1a
   FROM
   (SELECT @@VERSION as version) data
   )
   """
   query = """
   (SELECT *
   FROM
   #Temp1a)
   """
   ```

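   For context, a minimal PySpark sketch of how the two pieces could then be handed to the JDBC reader; the option name `prepareQuery`, the connection URL, and the credentials are assumptions for illustration, not taken from this PR's diff:
   
   ```python
   # Assumed usage sketch: the option name "prepareQuery" and the connection
   # details are placeholders, and withClause / query are the strings from the
   # split above.
   from pyspark.sql import SparkSession
   
   spark = SparkSession.builder.getOrCreate()
   
   df = (spark.read
         .format("jdbc")
         .option("url", "jdbc:sqlserver://example-host:1433;databaseName=master")  # placeholder URL
         .option("prepareQuery", withClause)   # assumed option name for the prefix
         .option("query", query)
         .option("user", "example_user")       # placeholder credentials
         .option("password", "example_password")
         .load())
   df.show()
   ```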

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


