Splunk Search

Updating a lookup after a query result is returned

proylea
Contributor

I have a lookup table that looks like this:

_time,action,source
<time>,completed,<source>
<time>,completed,<source>
<time>,ready,<source>    

I use this lookup to select the source with action=ready when running another query, like this:

index=* [| inputlookup pr-test2.csv | search action=ready | fields - _time,action] | rest of the query.............
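
The subsearch returns the ready rows as field/value pairs (multiple rows would be ORed together), so with a single ready row the outer search effectively expands to something like:

index=* ( source="<source>" ) | rest of the query.............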

I would like to update, or rather overwrite, the lookup with the same query once the source has been used, so that it looks like this:

_time,action,source
<time>,completed,<source>
<time>,completed,<source>
<time>,completed,<source>

This is a CSV lookup, not a KV store; I don't have command-line access to update collections.conf, so I'm stuck with this for now.
Later, another scheduled search would find the latest source drop and add it to the lookup as ready; the query would then use the latest source to run its query over, ultimately updating an hourly summary index.

The reason for this strangeness with the lookup is that the source events are dropped in irregular batches, so I need to test whether the source is there before updating the summary index; regular updates would produce loads of duplicates.

Also open to other ways of doing this.

Kind Regards
Peter

proylea
Contributor

OK, I was able to do this with a CSV lookup. Here is what the final query looks like:

`FIN` [| inputlookup pr-test2.csv | search action=ready | fields - _time,action]
| main query .............
| append
    [| inputlookup pr-test2.csv | eval action=if(action="ready","completed",action) | table _time,action,source | outputlookup pr-test2.csv]
| search action!="completed"

So basically what this does is: use the lookup's action field as search input; run the main query, which outputs a load of evaluated fields for a summary index; then, using the append command, process the lookup; and finally remove the appended rows, after which the summary index is written to.
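
For completeness, the scheduled search mentioned earlier, the one that flags the latest source drop as ready, is along these lines (the index and time window are placeholders for my environment, and deduplication against sources already in the lookup is omitted here):

index=* earliest=-1h
| stats latest(_time) as _time by source
| eval action="ready"
| table _time,action,source
| outputlookup append=true pr-test2.csv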

493669
Super Champion
| inputlookup pr-test2.csv | search action=ready | eval action="completed", UniqueKey=_time
| outputlookup key_field=UniqueKey pr-test2.csv

Here I have considered _time as the unique key field and updated the action for rows that are "ready" to "completed".

Hope this helps.

proylea
Contributor

This is not a KV store, it's only a CSV lookup table, so the update doesn't work. Unfortunately I don't have command-line access to update collections.conf, so at this stage I need to use CSV lookups, which restricts me to overwrite and append only.

proylea
Contributor

The documentation says:
"To create a collection, create a collections.conf file in your app's /default or /local directory (for example, $SPLUNK_HOME/etc/apps/yourappname/default/collections.conf), then add a configuration stanza for each collection you want for your app."
Is that not the case? If so, can you direct me on how to do this?
Also, I am running Splunk 6.3, so there may be limitations.
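
For reference, I was expecting to have to add something like this stanza by hand (the collection name and field types here are just my guesses to match the lookup):

# $SPLUNK_HOME/etc/apps/yourappname/local/collections.conf
[pr_test_collection]
field.time = time
field.action = string
field.source = string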

493669
Super Champion

I don't understand what you mean by command-line access... do you mean you cannot make configuration file changes manually?
If I understand correctly, you can add a KV store from the UI, which will update collections.conf.

proylea
Contributor

I definitely require command-line access to collections.conf to make the KV store work, so I'm back to my original question.

493669
Super Champion

Go to Settings > Lookups > Lookup definitions > New.
Select the type "KV Store" and provide the collection name and supported fields.
Click Save. Now your KV store is ready to use.
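
Once the lookup definition exists, you can populate it like any other lookup, for example (assuming you named the definition pr_test_kv; use whatever name you chose):

| inputlookup pr-test2.csv | outputlookup pr_test_kv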

proylea
Contributor

I appreciate your input.
I tried creating the KV store and got a permissions issue when I tried to write to it with outputlookup; I reproduced it in test as well.
In test I then went and created collections.conf, and it works fine.
Again, I am unable to do this in prod, so using CSV lookups was my only option unless I want to engage prod support.
Nevertheless, I have posted the working solution, which can be done with regular lookups.

493669
Super Champion

To resolve the permission issue:
Go to Settings > Lookups > Lookup definitions.
Here you will see the collection you created; at the end of the row, click Permissions, give the role you are assigned read and write access, and click Save.
Now you can write to the KV store.
Yes, you can do it with regular lookups, as you have posted.

proylea
Contributor

Yes, I tried that; still the same error with permissions.
It's OK, all sorted for now. Cheers

proylea
Contributor

Ah OK, I haven't tried it before.
The docs say you must refer to a collection in collections.conf, so I assumed that meant you had to add one manually to the file.

493669
Super Champion

Yes, it's better to use a KV store for updates.
