I'm looking to do a one-off automation for some ETL. I have over 140k records to move, so I don't want to do them all at once. Ideally the trigger could maintain an offset: the first run processes 1,000 records, the next run knows to offset by 1,000, the next by 2,000, and so on.
Is there a way to maintain that count within the trigger, without having to update it manually every run?
The typical mechanism for throttling the number of data rows a polling trigger outputs is the “max items emitted” value in the Trigger - Emit New Items module:
When using this mechanism, the trigger fetches all data rows from the API, but each run only processes up to “max items emitted” of them. As data rows are successfully processed, their unique IDs are saved in the database, so on subsequent runs Trigger - Emit New Items only emits data rows that have not been processed before.
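In pseudocode, that pattern looks roughly like the sketch below. This is an illustration of the technique, not the module's actual implementation; `fetch_all`, `processed_ids`, and the `"id"` field are hypothetical names.

```python
def emit_new_items(fetch_all, processed_ids, max_items=1000):
    """Sketch of the 'emit new items' pattern.

    fetch_all() returns every row from the API; processed_ids is a set
    persisted between runs (the trigger stores it in the database).
    """
    # Keep only rows whose IDs we have not seen on a previous run.
    new_rows = [row for row in fetch_all() if row["id"] not in processed_ids]
    # Emit at most max_items of them this run.
    batch = new_rows[:max_items]
    # Record the emitted IDs so the next run skips them.
    processed_ids.update(row["id"] for row in batch)
    return batch
```

Each run works through the next slice of unprocessed rows until the set of saved IDs covers everything the API returns.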
If you can’t fetch all 140k records from the API at once, you will have to keep a counter of some sort to control pagination.
One approach is to do that manually via a Parameter module: edit the automation and adjust the value in the trigger’s settings by hand before each run.
Another approach is to build your own counter stored in the database.
Take a look at the Lookup Table actions here:
You could use the Save Value action in your automation, so that the last step performed for each processed data row increments the count.
Then, in the trigger, embed the Get Value action by dragging it out of the catalog, and use it to fetch the saved counter:
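The Get Value / Save Value loop described above can be sketched like this, with a dict standing in for the lookup table. The key name `"etl_offset"`, the `fetch_page(offset, limit)` API signature, and the 1,000-row page size are assumptions for illustration only.

```python
PAGE_SIZE = 1000  # assumed API page limit

def run_batch(store, fetch_page, process):
    """One trigger run: read the saved offset, fetch a page, advance the offset.

    store is a dict-like key-value table (stands in for the lookup table);
    fetch_page(offset, limit) calls the paginated API.
    """
    offset = store.get("etl_offset", 0)       # Get Value: last saved offset
    rows = fetch_page(offset, PAGE_SIZE)      # pull the next page
    for row in rows:
        process(row)                          # do the ETL work per row
    store["etl_offset"] = offset + len(rows)  # Save Value: advance the counter
    return len(rows)
```

Each run picks up where the last one left off; when `fetch_page` returns an empty page, the 140k records are done and the automation can be disabled.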
Yeah, I can only pull 1,000 records at a time. I thought lookup tables might be the way I'd have to handle it; I didn't know if something already existed for this type of pagination.