Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2019/12/29 06:08:00 UTC

[jira] [Updated] (SPARK-28427) Support more Postgres JSON functions

     [ https://issues.apache.org/jira/browse/SPARK-28427?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Takeshi Yamamuro updated SPARK-28427:
-------------------------------------
    Parent Issue: SPARK-30375  (was: SPARK-27764)

> Support more Postgres JSON functions
> ------------------------------------
>
>                 Key: SPARK-28427
>                 URL: https://issues.apache.org/jira/browse/SPARK-28427
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Josh Rosen
>            Priority: Major
>
> Postgres features a number of JSON functions that are missing in Spark: https://www.postgresql.org/docs/9.3/functions-json.html
> Redshift's JSON functions (https://docs.aws.amazon.com/redshift/latest/dg/json-functions.html) have partial overlap with the Postgres list.
> Some of these functions can be expressed in terms of compositions of existing Spark functions. For example, I think that {{json_array_length}} can be expressed with {{cardinality}} and {{from_json}}, but there's a caveat related to legacy Hive compatibility (see the demo notebook at https://databricks-prod-cloudfront.cloud.databricks.com/public/4027ec902e239c93eaaa8714f173bcfc/5796212617691211/45530874214710/4901752417050771/latest.html for more details).
> I'm filing this ticket so that we can triage the list of Postgres JSON features and decide which ones make sense to support in Spark. After we've done that, we can create individual tickets for specific functions and features.
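For reference, a minimal sketch of the composition mentioned in the description above, written in Spark SQL. The schema string 'array<int>' is an illustrative assumption, and this sketch does not address the legacy Hive compatibility caveat discussed in the linked notebook:

    -- Emulating Postgres' json_array_length('[1, 2, 3]') with existing Spark functions:
    -- from_json parses the JSON string into an array, cardinality returns its length.
    SELECT cardinality(from_json('[1, 2, 3]', 'array<int>')) AS len;
    -- expected result: 3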



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org