Hello Splunk Users,
Here is the issue I am trying to solve in Splunk. Logs are uploaded to Splunk continuously (24/7), and there is one field of interest, Release Version, which I need to capture and store for every unit under test, keyed by Serial Number. I would also like to record a timestamp for when each Release Version was stored.
Example:
Serial Number 111 Release Version 1.0 TimeStamp 01/13/16 3pm
The Release Version on each of these units changes at unpredictable times, but every change is captured in the logs. So if this unit has a different Release Version tomorrow, it should show as below.
Example:
Serial Number 111 Release Version 1.0 TimeStamp 01/13/16 3pm
Serial Number 111 Release Version 1.1 TimeStamp 01/14/16 2pm
This way the history of the Release Version for each unit is also captured. Please help me get this implemented in any way possible.
Please let me know if any additional information is required or there are any questions.
Thanks a lot,
Abhi
Currently I have this working manually: I created a CSV file as a lookup table, with a lookup definition that returns the Release Version for the Serial Number whose data I am displaying in the search. The problem is that I have to update the CSV file by hand whenever the Release Version changes on any of the 100+ units, which happens very frequently.
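For reference, the manual approach amounts to a search along these lines; this is only a sketch, and the lookup definition name release_by_serial and the index/sourcetype are placeholders, since the real names are not shown above:

index=your_index sourcetype=your_sourcetype | lookup release_by_serial Serial_Number OUTPUT Release_Version | table Serial_Number Release_Version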
This is how you can set that up. I am assuming you have field extractions set up for Serial Number and Release Version; let's say they are Serial_Number and Release_Number. Also assume your lookup file is named serial_to_release_lookup.csv and has fields with the same names.
Your base search | stats count by Serial_Number Release_Number | append [| inputlookup serial_to_release_lookup.csv ] | stats count by Serial_Number Release_Number | table Serial_Number Release_Number | outputlookup serial_to_release_lookup.csv
Save this search (in the same app where the lookup is available; if the lookup is global, the app doesn't matter) to run at a frequency of your choice with a definite time range, to avoid overlap between two executions. It will get the Serial_Number/Release_Number mapping from the logs for the selected time range, merge it with the existing values in the lookup, and write the result back to the lookup. This way the lookup will hold all of the latest and historical Serial_Number/Release_Number values.
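For example, a scheduled-search stanza along these lines would run the merge hourly over the previous whole hour; this is only a sketch, and the stanza name, index/sourcetype, and schedule are placeholder choices:

# savedsearches.conf (sketch; stanza name, index/sourcetype, and schedule are hypothetical)
[update_serial_to_release_lookup]
enableSched = 1
# run at the top of every hour, over the previous whole hour, so runs never overlap
cron_schedule = 0 * * * *
dispatch.earliest_time = -1h@h
dispatch.latest_time = @h
search = index=your_index sourcetype=your_sourcetype | stats count by Serial_Number Release_Number | append [| inputlookup serial_to_release_lookup.csv] | stats count by Serial_Number Release_Number | outputlookup serial_to_release_lookup.csv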
Updated answer
index=Firmware_QA sourcetype=SecureCRT_log FW_Version=* | dedup SNum sortby FW_Version | table SNum FW_Version _time | append [| inputlookup FW_Version_SecureCRT.csv] | stats values(_time) as _time by SNum,FW_Version | eval _time=mvindex(_time,-1) | outputlookup FW_Version_SecureCRT.csv
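Once the lookup has accumulated some history, a quick way to review it is something like this (a sketch using the field names above; convert just renders the stored epoch time readably):

| inputlookup FW_Version_SecureCRT.csv | convert ctime(_time) AS Last_Seen | table SNum FW_Version Last_Seen | sort SNum FW_Version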
This worked perfectly the way I wanted. Thanks a lot somesoni2.
Abhi
Thanks somesoni2. I am trying out this recommendation in our environment.
I will circle back once it works or with some additional questions.
I am taking steps to implement the above in our environment. I have the field extractions in place, and they work fine for SNum (Serial Number) and FW_Version (Firmware Version). I am using '_time' to keep track of when the last known FW_Version was seen in the log.
I am enclosing my search string
index=Firmware_QA sourcetype=SecureCRT_log | dedup SNum sortby FW_Version | table SNum FW_Version _time | append [| inputlookup FW_Version_SecureCRT.csv]
I am using 'dedup' to get only unique SNum (Serial Number) values in the table, each tied to its FW_Version (Firmware Version).
I want to append to FW_Version_SecureCRT.csv only if: a) there is a valid FW_Version in the table (most of the time it is empty, as the log does not contain a FW_Version in the selected time range), and b) when there is a value, it is different from the one already in the CSV file for that SNum (a sketch of this is below).
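Something along these lines might cover both conditions, building on the searches above; it is only a sketch: FW_Version=* handles (a) by dropping events without a firmware version, and the final stats handles (b) by keeping one row per SNum/FW_Version pair, so existing pairs are never duplicated:

index=Firmware_QA sourcetype=SecureCRT_log FW_Version=* | stats latest(_time) as _time by SNum FW_Version | append [| inputlookup FW_Version_SecureCRT.csv] | stats max(_time) as _time by SNum FW_Version | outputlookup FW_Version_SecureCRT.csv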
Thanks a lot for all the help.
If you can generate a search that produces the results you want, then you just have to add | outputlookup filename.csv to the end. Make it a scheduled search to make it automatic.
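In other words, the overall pattern is roughly this, with the search body and filename as placeholders, saved as a scheduled search:

your base search | table SNum FW_Version _time | outputlookup filename.csv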