Request - Bulk Deleting Stored Data

When deleting items from the stored data, you can only delete one at a time or all at once. Both have their uses, but together they leave a gap.
If you have a large database (thousands of items) and need to delete a few hundred, the only way to do it is one at a time.
Alternatively, if you have to delete everything in order to reload, then as soon as you run the automation all incoming data is stored as new and nothing is processed.
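
In case it helps to see why that happens, here is a minimal sketch, assuming the stored data acts as a dedup store whose first (baseline) run records everything without processing it. The function name and the baseline-on-empty behavior are illustrative assumptions, not APIANT's actual code:

```python
def run_trigger(store: set, incoming: list) -> list:
    """Store unseen rows; only process rows once a baseline exists."""
    first_run = len(store) == 0       # empty store => baseline run (assumption)
    processed = []
    for row in incoming:
        if row in store:
            continue                  # already seen: skip
        store.add(row)                # remember the row either way
        if not first_run:
            processed.append(row)     # only process after the baseline run
    return processed

store = set()
print(run_trigger(store, ["a", "b"]))        # [] -> baseline run stores, processes nothing
print(run_trigger(store, ["a", "b", "c"]))   # ['c'] -> only genuinely new rows
```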

Because of that, I propose a few possible changes.

  1. When running for the first time, have a checkbox to allow all data to be processed.
    OR
  2. Allow a “dummy row” to be created on an empty database so that all new rows get processed. The row could be created manually or by APIANT; it's just a placeholder, really.
    Either one, I think, would resolve the issue there.

Also…
3. Add the ability to delete all rows on a page within the stored data. This would let someone filter on a value and then delete the whole page instead of a single row at a time. When deleting many rows to reprocess, this would help immensely (see the sketch after this list).
4. With that, allow a dynamic page size, maybe up to 50 or so, instead of limiting it to 10 rows. Being able to see more rows would help if you could delete by page.
5. Allow searching/filtering by the system-created date.
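
To make item 3 concrete, here is a rough sketch of what a filtered, page-wise delete could look like. The table schema and the delete_page helper are invented for illustration; none of this is APIANT's actual implementation:

```python
import sqlite3

# Hypothetical "stored data" table with a value and a system-created date.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stored_data (id INTEGER PRIMARY KEY, value TEXT, created TEXT)")
conn.executemany(
    "INSERT INTO stored_data (value, created) VALUES (?, ?)",
    [(f"102005-{i}", "2023-01-01") for i in range(120)],
)

def delete_page(conn, value_filter, page=0, page_size=50):
    """Delete one filtered page of rows instead of one row at a time."""
    ids = conn.execute(
        "SELECT id FROM stored_data WHERE value LIKE ? ORDER BY id LIMIT ? OFFSET ?",
        (f"%{value_filter}%", page_size, page * page_size),
    ).fetchall()
    conn.executemany("DELETE FROM stored_data WHERE id = ?", ids)
    return len(ids)

print(delete_page(conn, "102005"))  # 50 -> one whole page of matches gone at once
```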

I don't know about anyone else, but these would help on my end quite often, especially during testing.

Hi Brent,

Thanks for the feedback!

For #1 & 2, you can use the existing “export mode” option after choosing to execute the automation manually to process all data emitted by the trigger:

Being able to delete stored data based on a search sounds good, plus searching by date ranges. Adjusting the page size also sounds helpful. Will try to add these in the next release.

Robert

OK. I always assumed that was just for actually exporting the retrieved data.

Thanks

I understand. The typical scenario for wanting to process all output trigger data is for export/import from the trigger app into an action app.

The other benefit of the export option is that all timeouts will be ignored: the automation will be allowed to run to completion.

If you run an export, does it ignore any existing data rows and process everything? Hypothetically, could you process duplicates?

It ignores everything and processes whatever is emitted by the trigger.
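
Conceptually, something like this sketch, assuming export mode simply bypasses the dedup check; the flag name and logic are illustrative, not the actual implementation:

```python
def run(store: set, incoming: list, export_mode: bool = False) -> list:
    processed = []
    for row in incoming:
        if not export_mode and row in store:
            continue           # normal mode: skip rows already stored
        store.add(row)
        processed.append(row)  # export mode: everything, duplicates included
    return processed

store = {"a"}
print(run(store, ["a", "b"]))                    # ['b']       normal dedup
print(run(store, ["a", "b"], export_mode=True))  # ['a', 'b']  all reprocessed
```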

OK, that is great. The reason all this came up is testing for a client: we made some changes to an automation and want to reload a dataset that we already processed. I thought deleting line by line would be the only way.

Appreciate it.

Quick follow-up note: when searching for a value, it seems it has to be the full value, not just a partial one.

For example:

If I search for the first 6 digits of the first line there (102005), it returns nothing.

If it would return all rows containing that value, not just 100% matches, that would be ideal.
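
In code terms, this is the difference I mean (the stored values below are made up):

```python
values = ["1020051234", "1020059999", "2000123456"]
term = "102005"

exact = [v for v in values if v == term]    # current behavior: [] (no full match)
partial = [v for v in values if term in v]  # requested: ['1020051234', '1020059999']

print(exact, partial)
```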

Sounds good, will also add that to the next system release.

The new functionality has been added to the next release:

  • can now search for entries by date
  • can select the number of rows to show
  • can now delete entries in bulk via a “delete all” button
  • searching matches on any portion of the entered text