Posted to issues@spark.apache.org by "Jeremy Beard (JIRA)" <ji...@apache.org> on 2016/04/20 21:11:25 UTC

[jira] [Commented] (SPARK-14764) Spark SQL documentation should be more precise about which SQL features it supports

    [ https://issues.apache.org/jira/browse/SPARK-14764?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15250526#comment-15250526 ] 

Jeremy Beard commented on SPARK-14764:
--------------------------------------

Ideally I'd be looking for the documentation to define its own syntax rather than effectively saying that it's mostly like Hive. Many developers won't have used Hive before and might not be using HiveContext (and in my opinion will also find the Hive documentation rather lacking). There are also a couple of differences I've personally hit between Hive and Spark SQL that weren't listed on the Spark SQL page: you can't INSERT OVERWRITE into the source table of the query, and you can't use subqueries in a WHERE clause. The documentation isn't really wrong here, though, because it doesn't claim to be precise or complete about which SQL features are supported.
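For illustration, the two differences mentioned above might look something like the following (table and column names are hypothetical; behavior described is for Spark SQL 1.5, per the issue's affected version):

```sql
-- INSERT OVERWRITE targeting the same table the query reads from:
-- accepted by Hive, but rejected by Spark SQL 1.5
INSERT OVERWRITE TABLE sales
SELECT * FROM sales WHERE amount > 0;

-- Subquery inside a WHERE clause:
-- accepted by Hive, but not supported by Spark SQL 1.5
SELECT * FROM sales
WHERE customer_id IN (SELECT id FROM customers WHERE region = 'EMEA');
```

Neither limitation is called out on the Spark SQL documentation page, which is the kind of gap this issue is asking to close.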

> Spark SQL documentation should be more precise about which SQL features it supports
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-14764
>                 URL: https://issues.apache.org/jira/browse/SPARK-14764
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, SQL
>    Affects Versions: 1.5.0
>            Reporter: Jeremy Beard
>            Priority: Minor
>
> Terminology such as "vast majority" and "most" is difficult to develop against without a lot of trial and error. It would be excellent if the Spark SQL documentation could be more precise about which SQL features it does and doesn't support. In a sense this is part of the API of Spark.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org