Posted to issues@spark.apache.org by "Takeshi Yamamuro (JIRA)" <ji...@apache.org> on 2016/07/11 16:20:11 UTC

[jira] [Commented] (SPARK-15816) SQL server based on Postgres protocol

    [ https://issues.apache.org/jira/browse/SPARK-15816?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15371091#comment-15371091 ] 

Takeshi Yamamuro commented on SPARK-15816:
------------------------------------------

I'm working on this prototype: https://github.com/apache/spark/compare/master...maropu:SPARK-15816
Currently, I'm checking whether we can map all the types implemented in Spark SQL, including nested ones.
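To illustrate what that type-mapping check involves, here is a minimal sketch (not the prototype's actual code): a simplified stand-in for Catalyst data types mapped onto Postgres type names, including nested arrays. The `SqlType` hierarchy and `toPgTypeName` helper are hypothetical names invented for this example; the Postgres type names (int4, int8, float8, text, and the `[]` array suffix) are real.

```scala
// Simplified stand-in for a few Catalyst data types (illustration only,
// not Spark's actual org.apache.spark.sql.types API).
sealed trait SqlType
case object IntType extends SqlType
case object LongType extends SqlType
case object DoubleType extends SqlType
case object StringType extends SqlType
case class ArrayType(element: SqlType) extends SqlType

// One possible mapping onto Postgres type names; nested types recurse.
def toPgTypeName(t: SqlType): String = t match {
  case IntType      => "int4"
  case LongType     => "int8"
  case DoubleType   => "float8"
  case StringType   => "text"
  case ArrayType(e) => toPgTypeName(e) + "[]" // Postgres array syntax
}
```

The interesting cases are the nested ones, e.g. an array of longs would surface to Postgres clients as int8[].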

> SQL server based on Postgres protocol
> -------------------------------------
>
>                 Key: SPARK-15816
>                 URL: https://issues.apache.org/jira/browse/SPARK-15816
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Reynold Xin
>
> At Spark Summit today this idea came up from a discussion: it would be great to investigate the possibility of implementing a new SQL server using Postgres' protocol, in lieu of Hive ThriftServer 2. I'm creating this ticket to track this idea, in case others have feedback.
> This server can have a simpler architecture, and allows users to leverage a wide range of tools that are already available for Postgres (and many commercial database systems based on Postgres).
> Some of the problems we'd need to figure out are:
> 1. What is the Postgres protocol? Is there official documentation for it?
> 2. How difficult would it be to implement that protocol in Spark (on the JVM in particular)?
> 3. How does data type mapping work?
> 4. How do system commands work? Would Spark need to support all of Postgres' commands?
> 5. Are there any restrictions on supporting nested data?
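On question 1: the protocol is documented in the PostgreSQL manual's "Frontend/Backend Protocol" chapter. In protocol version 3, every message after the startup packet has a fixed frame: a one-byte message type (e.g. 'Q' for a simple query), then an Int32 length that counts itself but not the type byte, then the payload. A minimal sketch of reading one frame on the JVM (the `PgMessage` and `readMessage` names are hypothetical, invented for this example; the frame layout is from the spec):

```scala
import java.io.{ByteArrayInputStream, DataInputStream}

// One decoded protocol message: a type byte plus its payload.
case class PgMessage(msgType: Byte, body: Array[Byte])

// Read a single v3 frame: type byte, Int32 length (includes the 4 length
// bytes, excludes the type byte), then (length - 4) payload bytes.
def readMessage(in: DataInputStream): PgMessage = {
  val msgType = in.readByte()
  val length  = in.readInt()
  val body    = new Array[Byte](length - 4)
  in.readFully(body)
  PgMessage(msgType, body)
}

// Example: a simple Query message carrying "SELECT 1" as a
// NUL-terminated string, framed the way a Postgres client would send it.
val payload = "SELECT 1".getBytes("UTF-8") :+ 0.toByte
val frame   = java.nio.ByteBuffer.allocate(1 + 4 + payload.length)
frame.put('Q'.toByte).putInt(4 + payload.length).put(payload)
val msg = readMessage(new DataInputStream(new ByteArrayInputStream(frame.array())))
```

The startup message is the one exception (no type byte), so a real server needs a separate code path for the first packet on each connection.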



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org