Posted to issues@spark.apache.org by "Yusheng Ding (Jira)" <ji...@apache.org> on 2021/05/04 00:22:00 UTC

[jira] [Created] (SPARK-35299) Dataframe overwrite on S3 does not delete old files with S3 object-put to table path

Yusheng Ding created SPARK-35299:
------------------------------------

             Summary: Dataframe overwrite on S3 does not delete old files with S3 object-put to table path
                 Key: SPARK-35299
                 URL: https://issues.apache.org/jira/browse/SPARK-35299
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.0
            Reporter: Yusheng Ding


To reproduce:

test_table path: s3a://test_bucket/test_table/


df = spark_session.sql("SELECT * FROM test_table")

df.count()  # returns 1000 rows

#####S3 operation######

import boto3

s3 = boto3.client("s3")
s3.put_object(
    Bucket="test_bucket", Body="", Key="test_table/"
)

#####S3 operation######
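
For reference, the put_object call above creates a zero-byte object whose key is exactly the table prefix "test_table/" (an S3 "directory marker"). A minimal check with boto3, reusing the client created above, to confirm the marker is there:

resp = s3.list_objects_v2(Bucket="test_bucket", Prefix="test_table/", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])  # expect an entry "test_table/" with Size 0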

df.write.insertInto("test_table", overwrite=True)

# The same behavior occurs with df.write.save(mode="overwrite", format="parquet", path="s3a://test_bucket/test_table")

df = spark_session.sql("SELECT * FROM test_table")

df.count()  # returns 2000 rows
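
To make the doubled count concrete, a quick check (sketch only) is to list the parquet part files left under the table prefix after the overwrite; when the issue reproduces, the part files written before the overwrite are still present alongside the newly written ones:

resp = s3.list_objects_v2(Bucket="test_bucket", Prefix="test_table/")
part_files = [o["Key"] for o in resp.get("Contents", []) if o["Key"].endswith(".parquet")]
print(len(part_files))  # both the old and the new part files are listed when the bug occurs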


Overwrite is not functioning correctly: the old files are not deleted from S3, so after the overwrite the table contains both the old and the newly written data (2000 rows instead of 1000).
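
A possible workaround (an assumption, not a fix for the Spark behavior itself) is to delete every object under the table prefix, including the zero-byte marker, before performing the overwrite write. Note that in the repro above df is read lazily from the same table, so the data would need to be cached or staged elsewhere before clearing the prefix:

# Workaround sketch (assumption): clear s3a://test_bucket/test_table/ before overwriting.
import boto3

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="test_bucket", Prefix="test_table/"):
    keys = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
    if keys:
        s3.delete_objects(Bucket="test_bucket", Delete={"Objects": keys})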

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org