Posted to dev@jackrabbit.apache.org by "Amit Jain (JIRA)" <ji...@apache.org> on 2014/03/18 07:05:49 UTC

[jira] [Commented] (JCR-3750) Add a batch delete method for the data stores.

    [ https://issues.apache.org/jira/browse/JCR-3750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13938864#comment-13938864 ] 

Amit Jain commented on JCR-3750:
--------------------------------

Hi [~tmueller], [~chetanm],

My proposal is to add the following method to the DataStore and Backend APIs:
{code}void deleteRecords(List<DataIdentifier> ids, long maxModifiedTime){code}

Implementations will then handle the batch deletion in the best possible way.
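As an illustration of what "the best possible way" could look like (a sketch, not Jackrabbit code; `BatchDeleteSketch`, `partition`, and `BATCH_SIZE` are hypothetical names), a bulk-capable backend could chunk the ids so that many records are deleted per round trip, while a backend without bulk support would simply iterate:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of id chunking for a batch-delete implementation.
// BATCH_SIZE mirrors typical bulk-API limits (e.g. S3 DeleteObjects
// accepts up to 1000 keys per request).
public class BatchDeleteSketch {
    static final int BATCH_SIZE = 1000;

    // Split the ids into fixed-size chunks; each chunk would become
    // one bulk-delete call against the backend.
    static List<List<String>> partition(List<String> ids, int size) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += size) {
            batches.add(ids.subList(i, Math.min(i + size, ids.size())));
        }
        return batches;
    }
}
```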

Currently, DataStore does not expose an API for deletion by identifier, but MultiDataStoreAware does. I would prefer to add the new method to DataStore, but if you think it's better to add it to MultiDataStoreAware, I'll use that instead.
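For the maxModifiedTime parameter, one plausible reading (my assumption; `GuardedDeleteSketch` and its `lastModified` map are hypothetical, not Jackrabbit API) is that an implementation skips any record modified after the cutoff, so blobs touched during a garbage-collection run are not deleted out from under it:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical in-memory store illustrating the maxModifiedTime guard.
public class GuardedDeleteSketch {
    // id -> last-modified time in milliseconds
    final Map<String, Long> lastModified = new HashMap<>();

    // Delete only those ids whose last-modified time is at or before
    // the cutoff; returns the number of records actually deleted.
    int deleteRecords(List<String> ids, long maxModifiedTime) {
        int deleted = 0;
        for (String id : ids) {
            Long mod = lastModified.get(id);
            if (mod != null && mod <= maxModifiedTime) {
                lastModified.remove(id);
                deleted++;
            }
        }
        return deleted;
    }
}
```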

> Add a batch delete method for the data stores.
> ----------------------------------------------
>
>                 Key: JCR-3750
>                 URL: https://issues.apache.org/jira/browse/JCR-3750
>             Project: Jackrabbit Content Repository
>          Issue Type: Bug
>          Components: jackrabbit-data
>    Affects Versions: 2.7.5
>            Reporter: Amit Jain
>
> Currently, MultiDataStoreAware exposes a {code}deleteRecord(DataIdentifier identifier){code} method to delete a blob given an id. 
> The proposal is to add a new delete method that takes a batch of ids and exploits the underlying backend's ability to delete in batches, where supported. Backends such as S3 and databases can handle batch deletes.
> This will improve delete performance when a large number of deletions must happen, for example during garbage collection.



--
This message was sent by Atlassian JIRA
(v6.2#6252)