Trigger new items handling

I have a trigger that retrieves a set of records based on a date range, then summarizes them to create an entry in another system. Since each individual record's ID gets lost in the summary, I can't use that as the trigger's unique value. Typically we used the date range, but found that, due to their processes, records for a given day may get added after that day was already processed.

If I use the record ID as the unique value, then only one record gets pushed out of the trigger at a time, and I cannot summarize in an action.

So what I am looking for is a way to use the record IDs as the unique value BUT still summarize within the trigger, OR pass all the records at once to be summarized afterward (if that is even possible).

Maybe you should compute the MD5 hash of the summary records and use that as the unique value.

See this assembly for an example: ca5c557b227748b68ee6d376c736f1d7

Thanks, but I'm not sure how that would work.

A more detailed example: on the first run I get records 1, 2, 3, summarize, and process those. The next time it pulls that batch it returns 1, 2, 3, 4, 5. Since 1-3 were already done, I only need 4-5 to summarize and process. How would the hash know to exclude 1, 2, 3?

I know the summarizing is the part that is killing the process, since I lose the detail of the records. If I concat, hash, etc. the detail, then adding a single record will create a "new" batch, and all previous records would be processed again as part of the summary.

I'm not sure how I could use Emit New Items before the summary process. Or, as an idea, is there a way to use the trigger to retrieve and emit new items, then have the first action send them to a collector to be processed all together, handling the summarizing there?

I thought you would need records 1-3 included in the processing for 4 & 5.

The trigger will emit items based on whether the unique item ID is already in the database. If you only want 4 & 5 emitted by the trigger after 1-3 have been processed, then each record ID has to be stored in the database, and the trigger would have to emit each record individually.

If you can’t do that and instead have to emit the entire batch 1-5 from the trigger, then you can use keyvalue database storage in action logic to determine which records within the batch were already processed.
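A sketch of that action-side filtering, with the key-value store modeled as a plain JavaScript `Set` for illustration (`filterUnprocessed` is a hypothetical helper, not a platform API): record which IDs are done, and reduce each incoming batch to only the records not yet processed before summarizing.

```javascript
// Filter a batch down to unprocessed records, marking them as done.
// 'processedIds' stands in for a persistent key-value store.
function filterUnprocessed(batch, processedIds) {
  const fresh = batch.filter(r => !processedIds.has(r.id));
  fresh.forEach(r => processedIds.add(r.id)); // mark as processed
  return fresh;
}
```

With this approach, a first batch of records 1-3 processes fully, and a later batch of 1-5 yields only 4 and 5 for summarizing, matching the scenario above.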

Not ideal, but I'm also not sure I have another option.

OK, thanks. If you ever change the trigger handling to use the Emit New Items filtering as normal, but then emit all the remaining items as a batch instead of individually from the trigger, let me know. I think it would then be easy to use an action to summarize, like I do now in the trigger. I know a few places I could use that. Something like a checkbox on the trigger module to "emit all records as a batch" or something.

You can copy and revise assembly modules on your dev server to accomplish custom processing at the module level.

So you can clone the Trigger - Emit New Items module and make a new variation that meets your needs.

All it basically does is fetch record IDs from the database and emit records that are not there. The system natively knows to store trigger record IDs into the database when all action processing is successful for data rows emitted by triggers, as determined by this field in the trigger module:
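The core of a batch-emitting variation could be sketched like this (names are illustrative, not the real Module API; `storedIds` stands in for the trigger database lookup): filter out records whose IDs are already stored, then wrap whatever remains in a single batch instead of emitting each record as its own data row.

```javascript
// Hypothetical "emit new items as a batch" variation: drop records whose
// IDs are already in the trigger database, then emit the remainder as one
// batch so a downstream action can summarize it.
function emitNewItemsAsBatch(records, storedIds) {
  const fresh = records.filter(r => !storedIds.has(r.id));
  return fresh.length ? [fresh] : []; // one batch, or nothing to emit
}
```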

I noticed your dev system wasn’t configured with the permissions to access the module editor, probably b/c it was copied from your main system. Have reconfigured it so you can now access it if you want:

Awesome. I was never able to edit system modules, so that opens up a world for me.

Loaded dev and it still says not licensed for the module editor.

Sorry, I needed to update the license key as well.

You should have access now; reload the assembly editor first.

The LZX syntax is basically JavaScript. That is used when the module executes in the browser, either in the assembly editor or automation editor. The Java implementation is only used server-side.

Many modules have almost identical LZX & Java implementations. The Module API was designed so that the code looks as similar as possible for the browser and server. Many times the Java code just differs in terms of adding type info to variables.

The easiest way to get started is to examine existing modules and to clone modules that are close to desired functionality.

Data stream processing modules all have a certain boilerplate pattern, in that they validate their configuration, then have two loops that iterate through input wires and then through data streams within each wire. (Almost all modules only have a single wire as input, however.)

Suggest that you first examine an easy data processing module to help understand the code. The Math Hash Functions module looks like a good example to examine first. Check out both its LZX and Java implementations. You can quickly examine module code with the Module Browser tab. The items at the top of the catalog comprise the Module API classes.

Develop and test on your dev server. Then publish to your production server when ready (via the “publish” right-click menu in the catalog).

The best development approach is to first get the LZX code working in the browser by running the module in an assembly (use debug mode as needed to trace your code, or console.log), then, once the LZX code is working, translate it to Java (mostly copy-paste, adding type info to variables). Run the assembly on the server to test the Java code:

Your cloned Trigger - Emit New Items module would need to be tested within an automation. Its LZX code does very little b/c it has no real purpose when running in the browser other than to facilitate assembly building.

The “Server-Side Script” module and inline Java/PHP are most often used to extend system functionality in assemblies, but there are times when customizing/building modules is easier and better. There is a runtime performance benefit to using modules b/c inline code and all the input XML have to be either compiled or interpreted on the fly.

Let us know as you need help!

Oh, I don’t remember if your account will be able to save changes to baseline modules, but if so, don’t do it.

Our system upgrades would overwrite any changes made to baseline modules.

Generally, the very first step should be to save a copy of the module.
