How does deleteRecordsBulk handle large datasets?

I have a table with around 25,000 records that are no longer relevant, and I need to delete them.
Right now, I could do it with something like:

let result = await doo.table.deleteRecordsBulk(
    "exchangeRates",            // table to delete from
    25000,                      // number of records to delete in one call
    `baseCurrency(notempty)`    // filter: records whose baseCurrency is not empty
);
console.log("Finished");
console.log(result);

I’d like this process to be as fast as possible while keeping the load on the app light, and I’m wondering:

  1. Does deleteRecordsBulk automatically process large numbers of records internally in batches/chunks, or do I need to implement batching logic on the client side (see the batching sketch after this list)?
  2. If I send multiple deleteRecordsBulk calls in parallel with the same filter, will they target the same records and overlap, or does the API avoid such collisions (see the second sketch below)?
  3. Is there a recommended, more efficient approach for deleting large datasets without risking duplicate delete attempts or hitting API throttling limits (see the backoff sketch below)?
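
For question 1, if the answer turns out to be "no", here is the kind of client-side batching I would fall back to. It is only a sketch: BATCH_SIZE, DELAY_MS, and the deletedCount field on the result are my assumptions, not documented behavior.

// Delete in fixed-size batches until a call removes fewer records than the
// batch size, which I take to mean the filter has been exhausted.
const BATCH_SIZE = 1000;   // assumed per-call limit
const DELAY_MS = 200;      // pause between calls to stay under throttling limits

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function deleteInBatches() {
    let totalDeleted = 0;
    while (true) {
        const result = await doo.table.deleteRecordsBulk(
            "exchangeRates",
            BATCH_SIZE,
            `baseCurrency(notempty)`
        );
        const deleted = result.deletedCount ?? 0; // assumed field name on the result
        totalDeleted += deleted;
        if (deleted < BATCH_SIZE) break; // less than a full batch left, so we're done
        await sleep(DELAY_MS);
    }
    console.log(`Deleted ${totalDeleted} records in total`);
}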
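
For question 2, the collision-free alternative I have in mind looks roughly like this: fetch the matching record IDs once, partition them into disjoint chunks, and delete each chunk in parallel. Both doo.table.getRecords and doo.table.deleteRecords are hypothetical names here; I don't know whether the API actually exposes calls like these.

async function deleteByPartitionedIds() {
    // Hypothetical read call: fetch only the IDs of the matching records.
    const ids = await doo.table.getRecords(
        "exchangeRates",
        `baseCurrency(notempty)`,
        { fields: ["id"] }
    );

    // Split the IDs into fixed-size, non-overlapping chunks.
    const CHUNK = 500;
    const chunks = [];
    for (let i = 0; i < ids.length; i += CHUNK) {
        chunks.push(ids.slice(i, i + CHUNK));
    }

    // Each chunk targets a disjoint set of IDs, so parallel calls cannot collide.
    await Promise.all(
        chunks.map((chunk) =>
            doo.table.deleteRecords("exchangeRates", chunk) // hypothetical delete-by-IDs call
        )
    );
}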
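
And for question 3, if throttling is a real risk, I would wrap each batch in a simple exponential backoff like the one below. This assumes a throttled call rejects with an error I can catch, which is again an assumption about the API's behavior rather than documented fact.

const MAX_RETRIES = 5;

async function deleteBatchWithBackoff(attempt = 0) {
    try {
        return await doo.table.deleteRecordsBulk(
            "exchangeRates",
            1000,
            `baseCurrency(notempty)`
        );
    } catch (err) {
        if (attempt >= MAX_RETRIES) throw err; // give up after a few retries
        const waitMs = 2 ** attempt * 500;     // 500 ms, 1 s, 2 s, ...
        await new Promise((resolve) => setTimeout(resolve, waitMs));
        return deleteBatchWithBackoff(attempt + 1);
    }
}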

If you have any recommendations or alternative approaches, I’d love to hear them.