When deleting items from the stored data, you can only delete one at a time or all at once. Both have their uses, but that leaves a gap.
If you have a large database (thousands of items) and need to delete a few hundred, the only way to do it is one at a time.
Alternatively, if you delete everything in order to reload, then as soon as you run the automation all the data is stored as new and nothing is processed.
Because of that, I propose a few possible changes:
1. When running for the first time, have a checkbox to allow all data to be processed.
OR
2. Allow a “dummy row” to be created in an empty database so that all new rows get processed. The row could be created manually or by APIANT; it's just a placeholder, really.
Either one, I think, would resolve the issue.
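To show what I mean, here is a rough sketch of how I understand the stored-data dedup to work, with the proposed first-run checkbox added. All the names (`stored_keys`, `process_all_on_first_run`, the `id` field) are placeholders I made up, not actual APIANT internals:

```python
def run_trigger(new_items, stored_keys, process_all_on_first_run=False):
    """Sketch of the dedup pattern: only items whose key is not already
    stored get processed; everything seen is stored so it is skipped on
    later runs. process_all_on_first_run is the proposed checkbox."""
    first_run = len(stored_keys) == 0
    to_process = []
    for item in new_items:
        if item["id"] in stored_keys:
            continue  # already seen on a previous run, skip it
        stored_keys.add(item["id"])  # store it so it isn't re-emitted later
        if first_run and not process_all_on_first_run:
            # current behavior: the first run only builds the baseline
            continue
        to_process.append(item)
    return to_process
```

The dummy-row idea works for the same reason: a single placeholder row makes the store non-empty, so the baseline-only first run never happens and every real row counts as new.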
Also…
3. Add the ability to delete all rows on a page within the stored data. This would let someone filter on a value and then delete by page instead of a single row at a time. When deleting many rows to reprocess, this would help immensely.
4. Along with that, allow a dynamic page size, maybe up to 50 rows or so, instead of limiting it to 10. Being able to see more rows would help if you could delete by page.
5. Allow searching/filtering by the system-created date.
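Putting #3-#5 together, the underlying operation is just “delete everything matching the current filter” in one pass. A minimal sketch, assuming each stored row carries a system-created timestamp (the field names here are my guesses, not APIANT's real schema):

```python
from datetime import datetime

def delete_matching(rows, field=None, value=None,
                    created_after=None, created_before=None):
    """Keep only the rows that do NOT match the filter, i.e. bulk-delete
    by field value and/or system-created date range in a single pass."""
    kept = []
    for row in rows:
        matches = True
        if field is not None and row.get(field) != value:
            matches = False
        if created_after is not None and row["created"] < created_after:
            matches = False
        if created_before is not None and row["created"] > created_before:
            matches = False
        if not matches:
            kept.append(row)
    return kept

# e.g. drop everything stored during a test window:
# rows = delete_matching(rows, created_after=datetime(2024, 1, 1),
#                        created_before=datetime(2024, 1, 31))
```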
I don't know about anyone else, but these would help on my end quite often, especially during testing.
For #1 & 2, you can use the existing “export mode” option after choosing to execute the automation manually; it processes all data emitted by the trigger.
Being able to delete stored data based on a search sounds good, and so does searching by date ranges. Adjusting the page size also sounds helpful. Will try to add these in the next release.
OK, that's great. The reason all this came up is testing for a client: we made some changes to an automation and want to reload a dataset that we already processed. I thought deleting line by line would be the only way.