I have a use case where there are 50+ lookup files that I need to 'sync' from one app context to another. The idea is to:
1) read the lookup from context of App1 search bar
2) outputlookup to a lookup file named 'UPDATE_<lookupname>.csv' that resides in App2 context
The plan is to keep the 50+ lookup file names in a lookup named myLookupFiles within App1, then pass each filename as a field into a macro. So the gist would be:
| inputlookup myLookupFiles
| `mySyncMacro(myLookupFileNameField)`
And the macro would then be something like:
join type=left max=0
[| inputlookup $myLookupFileName$]
| fields - myLookupFileNameField
| outputlookup createinapp=true UPDATE_$myLookupFileName$
| search blarg
Which, of course, doesn't work.
Thoughts on a way to iterate across all 50+ file names, specified as values in a lookup table, and create the correspondingly named "UPDATE_<lookupname>.csv" files?
what about map? Just a simple example here, but should give you the idea I think.
| inputlookup myLookupFiles
| map search="| inputlookup $filenamefield$ | outputlookup UPDATE_$filenamefield$"
I haven't used map much in the past, so I didn't think of it. Thanks for the reminder! I had to set the 'maxsearches' option so that I didn't hit the default limit of 10 subsearches. I also added a '| search blarg' to the map subsearch pipeline just as a sanity check, to make sure the result set was cleared out before moving on to the next lookup (not strictly necessary). It worked well, but it is NOT recommended if you have thousands of lookups, each of a huge size: expect an I/O spike! 🙂 Otherwise it works quite well for this limited sync use case. Thanks!
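For completeness, here is the shape of the final search. This is a sketch, not the exact search I ran: it assumes the field in myLookupFiles holding each file name is called filenamefield, and that 100 comfortably exceeds the actual number of lookups.

| inputlookup myLookupFiles
| map maxsearches=100 search="| inputlookup $filenamefield$ | outputlookup createinapp=true UPDATE_$filenamefield$ | search blarg"

map runs the quoted search once per input row, substituting each row's filenamefield value for the $filenamefield$ tokens, so every listed lookup gets read and written back out under the UPDATE_ prefix in a single pass.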