Posted to reviews@spark.apache.org by grantnicholas <gi...@git.apache.org> on 2018/01/18 16:36:23 UTC

[GitHub] spark issue #15726: [SPARK-18107][SQL][FOLLOW-UP] Insert overwrite statement...

Github user grantnicholas commented on the issue:

    https://github.com/apache/spark/pull/15726
  
    @viirya @yuananf a quick check of recent Spark releases shows that this fix has not yet been included. Are there any suggested workarounds in the meantime for dynamic partition insert overwrites?
    
    It sounds like if the user deletes the necessary partitions before running the dynamic insert overwrite query, then Hive will go down the "happy" performant path. This requires computing the dynamic partition values before running the insert query, but if you can do that, this workaround should work, right?
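
    The workaround above can be sketched as a sequence of SQL statements generated ahead of the insert. This is a minimal illustration, not the PR's fix itself; the table, source, and partition column names (`events`, `staging`, `dt`) are hypothetical, and in Spark each statement would be run via `spark.sql(...)`:

```python
# Sketch of the workaround discussed above: pre-compute the dynamic
# partitions, drop them explicitly, then run the dynamic INSERT OVERWRITE
# so Hive can take the performant path. All names here are illustrative.

def drop_partition_statements(table, partition_specs):
    """Build one ALTER TABLE ... DROP PARTITION statement per spec.

    partition_specs is a list of dicts mapping partition column -> value,
    e.g. [{"dt": "2018-01-18"}].
    """
    stmts = []
    for spec in partition_specs:
        cols = ", ".join(
            "{}='{}'".format(k, v) for k, v in sorted(spec.items())
        )
        stmts.append(
            "ALTER TABLE {} DROP IF EXISTS PARTITION ({})".format(table, cols)
        )
    return stmts


def overwrite_statement(table, partition_col, source):
    """Dynamic partition insert overwrite; the target partitions were
    dropped above, so the overwrite no longer has to resolve them itself."""
    return (
        "INSERT OVERWRITE TABLE {} PARTITION ({}) "
        "SELECT * FROM {}".format(table, partition_col, source)
    )
```

    The partition values themselves would first be computed from the source data, e.g. with something like `staging.select("dt").distinct().collect()`, before generating and executing the drop statements.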


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org