Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/05/12 07:42:00 UTC
[jira] [Resolved] (SPARK-35299) Dataframe overwrite on S3 does not delete old files with S3 object-put to table path
[ https://issues.apache.org/jira/browse/SPARK-35299?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-35299.
----------------------------------
Resolution: Incomplete
Spark 2.2.0 is EOL. Please try a higher version of Spark and see if the issue persists.
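Before retesting, a quick way to confirm which Spark version a session is actually running, assuming a live spark_session as in the report below:

# Print the version string of the running Spark session, e.g. "3.1.1"
print(spark_session.version)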
> Dataframe overwrite on S3 does not delete old files with S3 object-put to table path
> ------------------------------------------------------------------------------------
>
> Key: SPARK-35299
> URL: https://issues.apache.org/jira/browse/SPARK-35299
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.2.0
> Reporter: Yusheng Ding
> Priority: Major
> Labels: aws-s3, dataframe, hive, spark
>
> To reproduce (test_table path: s3a://test_bucket/test_table/):
>
> import boto3
>
> df = spark_session.sql("SELECT * FROM test_table")
> df.count()  # returns 1000 rows
>
> ##### S3 operation #####
> # Put an empty object directly at the table path prefix
> s3 = boto3.client("s3")
> s3.put_object(
>     Bucket="test_bucket", Body="", Key="test_table/"
> )
> ##### S3 operation #####
>
> df.write.insertInto("test_table", overwrite=True)
> # Same happens with df.write.save(mode="overwrite", format="parquet", path="s3a://test_bucket/test_table")
>
> df = spark_session.sql("SELECT * FROM test_table")
> df.count()  # returns 2000 rows; the old data is still there
>
> Overwrite is not functioning correctly: the old files are not deleted on S3.
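> A quick way to see what actually remains under the table path, a minimal
> sketch reusing the boto3 client and bucket layout from above:
>
> # List everything under the table prefix after the overwrite; both the
> # old and the new parquet part files show up here.
> resp = s3.list_objects_v2(Bucket="test_bucket", Prefix="test_table/")
> for obj in resp.get("Contents", []):
>     print(obj["Key"], obj["Size"])
>
> Deleting the stray zero-byte object first, e.g. with
> s3.delete_object(Bucket="test_bucket", Key="test_table/"), might avoid
> the behavior; that is an assumption, not a confirmed fix.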
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org