Add a static field during indexing

xbbj3nj
Path Finder

Hi All,

I want to add a static field called SLATime for each filename pattern range present in an existing field called FileName.
Example below:

FileName
filename1.log
filename2.log

For these patterns, I want to add the corresponding SLA field below while indexing.

SLATime
10:30 AM
11:00 PM

Could you help me do this without input lookups?
Can this be done via field lookups?

Thanks,
Prem

1 Solution

jeffland
Champion

Sure that's possible. But are you sure you want to have this field as an indexed field? I would suggest you use a search time field extraction, but feel free to ask again if you really need the indexed extraction.

For a search time field, you can either use a calculated field or add a field extraction. A calculated field could be done like this:

eval SLATime=mvindex(split(FileName, "."), 0)

provided there are no other . characters in the file name before the one delimiting the file extension (.log). Simply add the part after eval as a calculated field.
For a field extraction, you could use a transform stanza to extract the file name (without the extension) from the FileName field with a regex.
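As a sketch, that calculated field could be defined in props.conf like this (the sourcetype name is an assumption; replace it with the one your data uses):

```
[your_sourcetype]
EVAL-SLATime = mvindex(split(FileName, "."), 0)
```

This runs at search time, so no reindexing is needed and the definition can be changed at any point.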


xbbj3nj
Path Finder

Jeff - Thanks for your answer.
To give you some background: there are hundreds of file names, and they can follow any number of patterns like .log, .csv, .xls, etc., so I can't go by the file format type; rather, I know the list of files I need to look for.

So I need to assign a static SLATime field for those patterns I know:

FileName=*filename1*
FileName=*filename2*

At both ends of the filename there can be dynamic text that varies daily, like a date or time appended to it.

What is the best way to meet this requirement?


jeffland
Champion

If you want to make the field dependent on the value of the FileName field, then I'd suggest going with a field extraction based on a transforms stanza. That way, you can define a regex which only captures a value for SLATime if the regex matches, and SLATime won't contain data if it doesn't.
Assuming that's what you want to do, you need to create the extraction in transforms.conf like this:

[some_name]
REGEX = (?<SLATime>[\S]+\s[APM]+)
SOURCE_KEY = FileName

where you would need to adjust the REGEX to capture your other desired values as well (this one will only capture SLATime in the format you described in your question).
You will then need to define a stanza like this in props.conf to apply the above extraction to a sourcetype/source/host:

[some_sourcetype or host::some_host or source::some_source]
REPORT-something = some_name

Check the docs for all that here. You can of course also do the same via the Web Interface, the docs for that are here.


xbbj3nj
Path Finder

Ok, got what you're saying here.
So I need to use transforms.conf. Could you try this for the examples below?

Set 1:
Pattern in logs: *BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF*
Actual name: 2015_BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF_Nov5
Create fields like hourSLA=7, minuteSLA=30 [because the SLA time is 7:30 AM; I want to create these fields to compare against the actual transmission time later]

Set 2:
Pattern in logs: *Source_To_Source_Exception_Report*
Actual name: 2015_Source_To_Source_Exception_Report_Nov5
Create fields like hourSLA=9, minuteSLA=45 [because the SLA time is 9:45 AM; I want to create these fields to compare against the actual transmission time later]

The background is that I'm trying to build an SLA tracking dashboard that compares the actual transmission time against the SLA time and displays "SLA MISSED" or "SLA MET":

Filename SLAtime Actualtime SLAstatus
2015_BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF_Nov5 7:30 AM 7:32 AM MISSED
2015_Source_To_Source_Exception_Report_Nov5 9:45 AM 9:00 AM MET
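For reference, a search-time sketch that assigns these static SLA values without lookups, using eval with case() and match() (the patterns are taken from the examples above; the actualHour/actualMinute fields used in the comparison are assumptions and would come from however the transmission time is extracted):

```
... | eval hourSLA=case(match(FileName, "BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF"), 7,
                        match(FileName, "Source_To_Source_Exception_Report"), 9)
    | eval minuteSLA=case(match(FileName, "BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF"), 30,
                          match(FileName, "Source_To_Source_Exception_Report"), 45)
    | eval SLAstatus=if(actualHour * 60 + actualMinute <= hourSLA * 60 + minuteSLA, "MET", "MISSED")
```

Events whose FileName matches neither pattern get no hourSLA/minuteSLA value, since case() returns null when no condition matches.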


jeffland
Champion

There is no info on the time (7:30) within those names. To extract the date, you could use this regex:

(?<year_sla>\d{4})_BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF_(?<month_sla>\w{3})(?<day_sla>\d+)

to extract to individual fields for the first set and

(?<year_sla>\d{4})_Source_To_Source_Exception_Report_(?<month_sla>\w{3})(?<day_sla>\d+)

for the second - see the idea?


xbbj3nj
Path Finder

No, I don't.

Apologies for the filenames.

Please consider this ,

abcdeeeee*BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF*abcdeeee
abcdeeeee*Source_To_Source_Exception_Report*abcdeeee

I only know these highlighted patterns, and I want to assign a static SLA time to each of them now as individual fields, hourSLA and minuteSLA.


jeffland
Champion

Oh, so you want to capture whatever is before BWA or Source? Then do it like this:

(?<stuff_before>.+?)(?:BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF|Source_To_Source_Exception_Report)(?<stuff_after>.+)

You'd obviously have to rename the capturing groups to something appropriate (I can't see whether hourSLA or minuteSLA comes before the middle part).

By the way, you might want to check out a tool such as regex101.com to help you with all your regex needs.
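Putting the pieces together, here is a sketch of the conf-file approach (the stanza names and the sourcetype are assumptions, and the SLA values are the ones given earlier in the thread):

```
# transforms.conf -- capture which known pattern appears in FileName
[extract_sla_pattern]
SOURCE_KEY = FileName
REGEX = (?<sla_pattern>BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF|Source_To_Source_Exception_Report)

# props.conf -- apply the extraction, then map the pattern to static SLA fields
[your_sourcetype]
REPORT-sla = extract_sla_pattern
EVAL-hourSLA = case(sla_pattern == "BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF", 7, sla_pattern == "Source_To_Source_Exception_Report", 9)
EVAL-minuteSLA = case(sla_pattern == "BWA_SHADOW_EXTRACT_MC_CCP_VM_ASOF", 30, sla_pattern == "Source_To_Source_Exception_Report", 45)
```

This works because search-time field extractions (REPORT) are evaluated before calculated fields (EVAL), so sla_pattern is available when the EVAL expressions run; new patterns only require adding an alternative to the regex and a branch to each case().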
