Updating stored CSV

I currently have an automation set up that reads records from one system, does a few lookups, and then writes to another. Those lookups take a few loops per record, which really slows things down and bogs down the API it's calling. I have used the file-csv module to store a sort of CSV mapping file for this exact use case, but what I have found is that if the file has records that need to be added regularly, I can't just leave that to the client to do. Instead we have the data stored in a table on the source system and, as mentioned above, do lookups against it.

Ideally we could still do that, but instead of looking it up there every time, we would duplicate the table into a CSV in APIANT. So is there a way to update a stored CSV from one automation that is referenced in another automation? Then I could just keep the CSV in sync, run the automation, and all my lookups would be internal instead of all the extra API calls. If not, is there a way to store a table in APIANT that is also updatable and referenceable by multiple automations?

Sounds like you should use the Lookup Table app for that, which will use the system’s database for storage. Store the data in “account” scope so multiple automations can read/write the data.

I was looking at that but wasn't sure how I would use it for my case. I have a table with 5 values per row, and I need to use 2 of those values to retrieve the other 3.

Is there a sample automation that I could look at for this?

The Lookup Tables data storage is basically key-value, but with 5 levels of groupings.

So if you need to look up rows by 2 values, I would concatenate them together (with a separator character in between so you can split() them later if needed) to form the lookup key. Then each of the 3 other values would be stored separately under its own value name.
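To make the idea concrete, here is a minimal sketch of the concatenated-key scheme. A plain Python dict stands in for the Lookup Tables storage (this is not APIANT's actual API), and the field names and values are hypothetical:

```python
# Conceptual sketch: a dict stands in for the Lookup Tables storage.
# Two values are joined with a separator to form the lookup key;
# each of the remaining fields is stored under its own value name.

SEP = "|"  # pick a separator that never appears in the data

store = {}  # (key_group, key, value_name) -> value

def put(key_group, key_parts, value_name, value):
    key = SEP.join(key_parts)
    store[(key_group, key, value_name)] = value

def get(key_group, key_parts, value_name):
    key = SEP.join(key_parts)
    return store.get((key_group, key, value_name))

# Store one row: look up by (legacy account, company), retrieve 3 other fields.
# All names and values below are made up for illustration.
put("accounts", ("LEG-001", "Acme"), "new_account", "A-100")
put("accounts", ("LEG-001", "Acme"), "region", "West")
put("accounts", ("LEG-001", "Acme"), "rep", "Jane")

print(get("accounts", ("LEG-001", "Acme"), "region"))  # -> West
```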

Sorry, we don’t have anything in the way of sample automations. It’s best to experiment.

This may help visualize how the data is stored in the database:

The Lookup Tables actions will set the scope and scope_uuid based on your settings selections (use account scope so you can access data across multiple automations). Otherwise the columns match up with the inputs for the Lookup Tables actions.

OK, so for example, here is a data set that I would be using (in reality it is about 5k rows).
I need to use the legacy account and company to retrieve the other 3 fields.
By what you are describing, I would need to concat those two fields into one and store that as “the key”, use each of the other 3 field names as “the value name” for 3 records, and store the related field value as “the value”.
So the first row from above would be like so…
Then when I am ready to retrieve it, I would use the keygroup, key, and value name to retrieve the value.

Am I understanding this right?

Yes! That is how I envisioned it based on what you described wanting to do.

The database table is heavily indexed, so lookup performance will be fast with just 5k rows. Our main database at apiant.com has many millions of rows in that table.

You may want to build some type of “loader” automation that loads data into the database from the existing CSV file. You would run that once manually to load the CSV data into the system’s database.
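A rough sketch of what such a loader does with the data: read the existing CSV once and write one key-value entry per retrievable field, using the concatenated-key scheme. In APIANT this would be built from the CSV and Lookup Tables actions; the plain-Python version below just illustrates the data flow, and the column names are assumptions:

```python
# Hypothetical loader sketch: read a CSV mapping file and produce
# (key_group, key, value_name) -> value entries. The dict stands in
# for the system's Lookup Tables storage; column names are made up.

import csv
import io

SEP = "|"

def load_csv(f, key_group, key_cols, value_cols):
    """Build key-value entries keyed by the concatenation of key_cols."""
    store = {}
    for row in csv.DictReader(f):
        key = SEP.join(row[c] for c in key_cols)
        for col in value_cols:
            store[(key_group, key, col)] = row[col]
    return store

# Example run with a tiny in-memory CSV (hypothetical columns):
sample = io.StringIO(
    "legacy_account,company,new_account,region,rep\n"
    "LEG-001,Acme,A-100,West,Jane\n"
)
store = load_csv(sample, "accounts",
                 ["legacy_account", "company"],
                 ["new_account", "region", "rep"])
print(store[("accounts", "LEG-001|Acme", "region")])  # -> West
```

Running it once against the full 5k-row file would populate the database, after which every per-record lookup stays internal.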

To view and manage data loaded into the database at the account level scope, right click in the background of the dashboard and choose this menu option:

Then you will be able to view and manage the data:

Great. Is this only built at the action level? Are there any subassemblies that can be called to do these lookups?

You can inspect the Lookup Table action assemblies. They use Simple DB modules for the database interactions.
