Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2021/08/12 13:19:04 UTC

[GitHub] [hudi] dongkelun edited a comment on pull request #3380: [HUDI-2259]Support referencing subquery with column aliases by table alias in me…

dongkelun edited a comment on pull request #3380:
URL: https://github.com/apache/hudi/pull/3380#issuecomment-897632707


   @pengzhiwei2018 Hi, when I test with Spark 3, I find that Spark SQL for Hoodie on Spark 3 uses Spark's own parser, but column aliases in MERGE INTO are not supported in Spark 3; it throws the exception 'Columns aliases are not allowed in MERGE.'. I see two possible solutions: one is to modify the Spark 3 source code so that Spark supports it, the other is to write code in hudi-spark3 to implement Spark SQL for Hoodie ourselves, but I personally feel that would be a big change, and I am not sure my understanding is correct. I was hoping you could offer some advice.
   
   // org.apache.spark.sql.catalyst.parser.AstBuilder
   val sourceTableAlias = getTableAliasWithoutColumnAlias(ctx.sourceAlias, "MERGE")

   private def getTableAliasWithoutColumnAlias(
       ctx: TableAliasContext, op: String): Option[String] = {
     if (ctx == null) {
       None
     } else {
       val ident = ctx.strictIdentifier()
       if (ctx.identifierList() != null) {
         throw new ParseException(s"Columns aliases are not allowed in $op.", ctx.identifierList())
       }
       if (ident != null) Some(ident.getText) else None
     }
   }
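
   For reference, here is a minimal sketch (with hypothetical table and column names, assuming a local SparkSession) of the MERGE INTO shape that trips this check on Spark 3: the source subquery's table alias carries column aliases, so ctx.identifierList() is non-null and the ParseException above is thrown during parsing, before any table is resolved.

   import org.apache.spark.sql.SparkSession

   object MergeColumnAliasRepro {
     def main(args: Array[String]): Unit = {
       val spark = SparkSession.builder()
         .appName("merge-column-alias-repro")
         .master("local[1]")
         .getOrCreate()

       // The source alias "s (id, price, ts)" carries column aliases, so on
       // Spark 3 the parser fails with "Columns aliases are not allowed in MERGE."
       // while building the plan, before analysis starts.
       spark.sql(
         """
           |MERGE INTO target_table t
           |USING (SELECT 1 AS c1, 20.0 AS c2, 1000 AS c3) s (id, price, ts)
           |ON t.id = s.id
           |WHEN MATCHED THEN UPDATE SET *
           |WHEN NOT MATCHED THEN INSERT *
           |""".stripMargin)

       spark.stop()
     }
   }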


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org