Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/01/10 04:21:02 UTC

[GitHub] [spark] HyukjinKwon commented on pull request #30984: [SPARK-33915][SQL] Allow json expression to be pushable column

HyukjinKwon commented on pull request #30984:
URL: https://github.com/apache/spark/pull/30984#issuecomment-757409794


   @tedyu, the special characters are not allowed in some sources such as Hive, as you tested. However, other sources do allow them when you use the DSLs:
   
   ```scala
   scala> spark.range(1).toDF("GetJsonObject(phone#37,$.phone)").write.option("header", true).mode("overwrite").csv("/tmp/foo")
   
   scala> spark.read.option("header", true).csv("/tmp/foo").show()
   +-------------------------------+
   |GetJsonObject(phone#37,$.phone)|
   +-------------------------------+
   |                              0|
   +-------------------------------+
   ```
   
   In this case, the filters will still be pushed down to the data source implementation. We need a way to tell whether the pushed `GetJsonObject(phone#37,$.phone)` is a field name or a pushed-down expression.
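   
   To make the ambiguity concrete, here is a minimal sketch of what a V1 source receives (the filter value `"123"` is made up for illustration; `EqualTo` and `PrunedFilteredScan` are the existing `org.apache.spark.sql.sources` APIs):
   
   ```scala
   import org.apache.spark.sql.sources.{EqualTo, Filter}
   
   // A filter on a real column that happens to be named
   // "GetJsonObject(phone#37,$.phone)" ...
   val fromColumn: Filter = EqualTo("GetJsonObject(phone#37,$.phone)", "123")
   
   // ... and a filter produced by pushing the expression
   // get_json_object(phone, '$.phone') = '123' down as a plain string ...
   val fromExpression: Filter = EqualTo("GetJsonObject(phone#37,$.phone)", "123")
   
   // ... look identical at the source. PrunedFilteredScan.buildScan(
   //   requiredColumns: Array[String], filters: Array[Filter])
   // only ever sees the attribute as a String, so it cannot tell a
   // column reference apart from a stringified expression.
   assert(fromColumn == fromExpression)
   ```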
   
   This makes me believe the current string-based implementation is flaky and incomplete.

