Storing a one-time-use lookup table in memory

Looking for the best way to handle this. I often have to loop through my data row by row and do a lookup call into a system for each row; obviously if there are hundreds of rows or more this can get massive. I would rather pull the entire table (or whatever subset I need) and store it in memory for that one automation run, use that stored table as a lookup table, and then have it cleared when the automation ends. Two ways I thought of handling this: use the built-in lookup tables to store it, which seemed like it might be overkill, or do something like the stored CSV I have used in an automation as a permanent lookup, though I didn't see a real way to store that in a sub and call it repeatedly. I assume you have come across this before and might have a better (aka more efficient) way of handling this.
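The pattern being asked about can be sketched in Python. This is only an illustration of the general technique, not any product's API; `fetch_table()` is a hypothetical stand-in for the one bulk call into the external system.

```python
# Sketch of the pattern described above: instead of one lookup call per
# row, prefetch the whole table into a dict once, then resolve each row
# from memory. fetch_table() is a hypothetical stand-in for the real
# bulk system call and is not part of any actual product API.

def fetch_table():
    # One bulk call that returns the whole lookup table (or a subset).
    return [("A100", "Widgets"), ("B200", "Gadgets")]

def process(rows):
    # Build the in-memory lookup table once per run.
    table = dict(fetch_table())
    results = []
    for row in rows:
        # Per-row resolution is now a dict hit, not a system call.
        results.append((row, table.get(row, "UNKNOWN")))
    return results

print(process(["A100", "B200", "C300"]))
# The dict goes out of scope when the run ends, so nothing persists.
```

The win is that hundreds of per-row round trips collapse into a single bulk fetch plus in-memory hits.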

There is currently no functionality for in-memory data storage while an automation is processing. There are many reasons why in-memory data storage would be impractical.

The recommended solution is to use Lookup Table actions which are backed by database storage.

Each Lookup Table action generally takes 20ms or less, so there isn’t a huge amount of overhead in using the database for storage.

Thanks. I didn't know if there was a better solution.

Sure. Let us know if for some reason the Lookup Table actions installed on your system are not sufficient and make solving a problem harder than you think it should be. We may be able to provide variation actions to help.

So I did this before (ec09cc334f6d46ec8501e1e75c90cbc7), which stores a CSV file and uses it as a repeated lookup table. Worked fantastic. I didn't know if there was a similar way to do that with the data I get returned. I'm sure the lookup tables will work fine too, since that is what they are intended to do. I guess I was focused more on the fact that once that instance of the automation finishes, I don't need anything stored anymore, so I would have to call the lookup tables to purge. I know they can be created as temp, but they don't purge immediately, so if the purge fails and the automation runs again and populates the lookup, there could be potential overlap.
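The stored-CSV approach mentioned above can be sketched in Python like this; the CSV content and column names are made up for illustration, and in practice the file would be read from wherever the automation stores it.

```python
import csv
import io

# Sketch of using a stored CSV as a repeated lookup table. The CSV text
# and its "code"/"name" columns are illustrative only.
CSV_DATA = "code,name\nA100,Widgets\nB200,Gadgets\n"

def load_lookup(csv_text):
    # Parse the CSV once into a dict keyed on the lookup column, so
    # every subsequent lookup is an in-memory dict hit.
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["code"]: row["name"] for row in reader}

lookup = load_lookup(CSV_DATA)
print(lookup["A100"])  # Widgets
```

Loading once and looking up many times is the same shape as the CSV trick described, just applied to data returned at run time.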

Here’s how I would handle that:

  1. As an initial step, create a random value via the “generate random value” action.

  2. For all lookup tables, use a keygroup of “temp_{random}_myname”, where {random} is mapped to the output of the random value action and myname helps to identify the data so I can find and manage it via the “Manage Account Data” account menu if ever needed.

In this manner, the data is scoped to each individual automation execution, and because it is temporary it will eventually be purged from the database.

Got it. So: initial action to build the table, then a subassembly within the next action to do the lookup as I go through my data. Thanks.

If you want to use subassemblies, then I would use a text value action to build the keygroup name once, then map that emitted value to all the lookup table actions and subassemblies that need to reference the keygroup.
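The suggestion above, sketched in Python: build the keygroup name in one place and pass the result everywhere, rather than reassembling the string in each action or subassembly, where a typo would silently create a second keygroup. All names here are illustrative only.

```python
# Build the keygroup name exactly once, in one place.
def build_keygroup(random_value, name="myname"):
    return f"temp_{random_value}_{name}"

def lookup_subassembly(keygroup, key, storage):
    # Every consumer takes the prebuilt name as an input instead of
    # formatting its own copy of the string.
    return storage.get((keygroup, key))

kg = build_keygroup("abc123")
storage = {(kg, "A100"): "Widgets"}
print(lookup_subassembly(kg, "A100", storage))  # Widgets
```

With one builder, a naming change (or bug fix) happens in a single spot and every lookup stays consistent.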

Sorry, to clarify: initial action to build the table; second action to pull the actual data I'm processing, with a subassembly there to do the lookups. Within that action I need to do a bit of data cleanup and parsing and then rebuild the data structure, so I assumed a sub within that action would be the cleanest way to handle it before pushing it down the line. I have used a subassembly before to look up values from a table, so it shouldn't be an issue.

Sure. I was just suggesting building the keygroup name one time in order to keep it consistent and avoid possible typos.