Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/04/04 03:15:00 UTC

[jira] [Commented] (SPARK-34928) CTE Execution fails for Sql Server

    [ https://issues.apache.org/jira/browse/SPARK-34928?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17314389#comment-17314389 ] 

Hyukjin Kwon commented on SPARK-34928:
--------------------------------------

[~supun.t.desilva], how does this relate to the {{net.sourceforge.jtds.jdbc.Driver}} driver? Does the query fail in Apache Spark itself, or only when you use that driver?
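
For context, Spark's JDBC data source wraps the supplied query in a derived-table subquery before sending it to the server. Assuming the CTE below is passed through the {{query}}/{{dbtable}} option, SQL Server would receive something like the following sketch (the generated alias name is illustrative), and T-SQL does not allow a {{WITH}} clause inside a derived table, which matches the reported error:

{code:sql}
-- Roughly what Spark's JDBC source sends to the server when given the CTE.
-- SQL Server rejects this with:
--   Incorrect syntax near the keyword 'WITH'.
SELECT * FROM (
    WITH OldChanges AS (
        SELECT DISTINCT SomeDate, Name
        FROM [dbo].[DateNameFoo] (nolock)
        WHERE SomeDate != '2021-03-30'
            AND convert(date, UpdateDateTime) = '2021-03-31'
    )
    SELECT * FROM OldChanges
) SPARK_GEN_SUBQ_0
{code}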

> CTE Execution fails for Sql Server
> ----------------------------------
>
>                 Key: SPARK-34928
>                 URL: https://issues.apache.org/jira/browse/SPARK-34928
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.1
>            Reporter: Supun De Silva
>            Priority: Minor
>
> h2. Issue
> We have a simple SQL statement, containing a CTE, that we intend to execute on SQL Server.
> Executing it yields an error like the following:
> {code:java}
> java.sql.SQLException: Incorrect syntax near the keyword 'WITH'.{code}
> We are using the JDBC driver *net.sourceforge.jtds.jdbc.Driver* (version 1.3.1).
> This is a particularly annoying issue; because of it we have to rewrite such CTEs as inner queries, which are a fair bit less efficient (a sketch of that rewrite appears after the SQL statement below).
> h2. SQL statement
> (not the actual one but a simplified version with renamed parameters)
>  
> {code:sql}
> WITH OldChanges AS (
>     SELECT DISTINCT
>         SomeDate,
>         Name
>     FROM [dbo].[DateNameFoo] (nolock)
>     WHERE SomeDate != '2021-03-30'
>         AND convert(date, UpdateDateTime) = '2021-03-31'
> )
> SELECT * FROM OldChanges {code}
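>
> The inner-query rewrite we fall back to looks roughly like this (a sketch; a derived table replaces the CTE):
>
> {code:sql}
> SELECT *
> FROM (
>     SELECT DISTINCT
>         SomeDate,
>         Name
>     FROM [dbo].[DateNameFoo] (nolock)
>     WHERE SomeDate != '2021-03-30'
>         AND convert(date, UpdateDateTime) = '2021-03-31'
> ) AS OldChanges {code}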



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org