It's working now.
I used the query below. Thanks all for the help 🙂
source="incident (1).csv" host="instance-3" index="incident" sourcetype="csv"
| rex max_match=0 field=_raw "(?<classification>(?i)(order))" | eval classification=lower(classification)| stats count by classification
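For anyone who wants to see the logic outside Splunk, here is a rough Python analogue of that query (the sample events are made up; the rex's case-insensitive match, lowercasing, and stats count by classification are mirrored with re.findall and Counter):

```python
import re
from collections import Counter

# Made-up stand-ins for the CSV events.
events = [
    "INC001 New Order created",
    "INC002 ORDER cancelled",
    "INC003 password reset",
]

# Mirror of: rex max_match=0 "(?i)(order)" | eval classification=lower(...) | stats count by classification
matches = []
for raw in events:
    matches.extend(m.lower() for m in re.findall(r"(?i)order", raw))

print(Counter(matches))  # count per classification value
```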
I understand that you are trying to get the total bandwidth utilization for 1 month.
Query
index=Proxy site="XXX"
| eval IO_bytes= (bytes_in+bytes_out)/1024
| eval Bytes=(bytes/1024)
| eval Total_bytes=if(IO_bytes==Bytes, Bytes, Total_bytes)
| table Bytes Total_bytes
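As a sanity check on the arithmetic, this is the per-event and total KB calculation in plain Python (field names taken from the query above; the sample values are invented):

```python
# Made-up sample events with the fields used in the query above.
events = [
    {"bytes_in": 2048, "bytes_out": 1024},
    {"bytes_in": 4096, "bytes_out": 0},
]

# eval IO_bytes = (bytes_in + bytes_out) / 1024  -- per-event KB
io_kb = [(e["bytes_in"] + e["bytes_out"]) / 1024 for e in events]

# Total bandwidth over the whole search window, in KB
total_kb = sum(io_kb)
print(io_kb, total_kb)  # [3.0, 4.0] 7.0
```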
That's right. Ideally, the combination of the first two lines should define the transaction start, and the last line with the same terminal number should be used as the transaction end, after which I want to calculate the duration.
You can specify the number of lines or an ending keyword for the transaction and it will fetch everything in between. Something like | transaction JSESSIONID clientip startswith="view" endswith="purchase" from here https://docs.splunk.com/Documentation/Splunk/7.2.6/SearchReference/Transaction. You can cheat and use the sourcetype as a common field.
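To illustrate what transaction with startswith/endswith is doing under the hood, here is a simplified Python sketch; the session IDs, actions, and timestamps are all hypothetical:

```python
# Hypothetical events: (timestamp_seconds, session_id, action)
events = [
    (100, "s1", "view"),
    (105, "s1", "add_to_cart"),
    (130, "s1", "purchase"),
    (200, "s2", "view"),
    (260, "s2", "purchase"),
]

# Rough analogue of: | transaction session_id startswith="view" endswith="purchase"
open_txn = {}   # session_id -> start timestamp
durations = {}  # session_id -> transaction duration in seconds
for ts, sid, action in events:
    if action == "view":
        open_txn[sid] = ts
    elif action == "purchase" and sid in open_txn:
        durations[sid] = ts - open_txn.pop(sid)

print(durations)  # {'s1': 30, 's2': 60}
```

The transaction command also computes this duration field for you automatically on each grouped event.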
You can use the IN operator like this:
index=x source IN("a", "b", "c") AND destination IN("1", "2", "3")
You could also use two lookup files that have lines like these:
source
a
b
c
d
Like this:
index=x AND [|inputlookup source.csv | table source] AND [|inputlookup destination.csv | table destination]
You could also use a macro.
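Both approaches boil down to set membership on two fields. A minimal Python sketch of that idea, with made-up events and the allowlists standing in for the IN(...) values or lookup files:

```python
# Allowlists standing in for the lookup files / IN(...) values.
sources = {"a", "b", "c"}
destinations = {"1", "2", "3"}

# Made-up events with source/destination fields.
events = [
    {"source": "a", "destination": "1"},
    {"source": "z", "destination": "2"},
    {"source": "b", "destination": "9"},
]

# Keep events whose source AND destination are both in the allowlists,
# like: source IN("a","b","c") AND destination IN("1","2","3")
kept = [e for e in events
        if e["source"] in sources and e["destination"] in destinations]
print(kept)  # only the first event survives
```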
I just wanted to let everyone know how I figured this out.
The trick to getting this to work was using the FIELD EXTRACTOR and entering in my own custom REGULAR EXPRESSION.
JSON format can be tricky: it escapes certain special characters. So I needed to view the event as "raw text" in order to see the extra backslashes that need to be accounted for.
The Regular Expression that I typed into the Field Extractor is below:
\\"valid_secs\\":\s(?<valid_secs>\d+)
This Regular Expression was able to successfully extract the integer value that came after the words "valid_secs" and store it into its own field which I named 'valid_secs'.
Once this new field was extracted, I was able to run the search below to get all events in which the field "valid_secs" was greater than 14400:
index=duo | search valid_secs>14400
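For reference, the same extraction and threshold check can be reproduced in Python; the raw event string below is invented, but it has the escaped quotes the post describes, and the pattern uses Python's (?P<name>...) group syntax:

```python
import re

# Made-up raw event text with escaped quotes, as seen in "raw text" view.
raw = 'event=auth {\\"valid_secs\\": 28800, \\"user\\": \\"alice\\"}'

# Same idea as the field-extractor regex, with the capture group named.
pattern = r'\\"valid_secs\\":\s(?P<valid_secs>\d+)'
m = re.search(pattern, raw)
valid_secs = int(m.group("valid_secs"))

# The alert condition: valid_secs > 14400
print(valid_secs, valid_secs > 14400)  # 28800 True
```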
I saved that search as an alert and I now get an alert every time that event is triggered.
Thank you to anyone who took the time to help and I hope this helps.
Not really sure which part you are trying to capture, but here is what I came up with:
| rex "java\.util\.concurrent\.Exception\:\s(?<CODE>.[^\;]+)\;\sException\:(?<EXCEPTION>.[^\:]+)\:\s\{\s(?<MSG>.[^\}]+)"
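If it helps to test the pattern outside Splunk, here is a Python port run against a made-up log line (the line itself and the captured values are my invention; the pattern is the same logic with (?P<name>...) groups):

```python
import re

# Made-up log line in the shape the rex above expects.
line = ("java.util.concurrent.Exception: ERR-42; "
        "Exception:TimeoutError: { request timed out}")

# Same pattern, using Python's (?P<name>...) group syntax.
pattern = (r"java\.util\.concurrent\.Exception:\s(?P<CODE>.[^;]+);"
           r"\sException:(?P<EXCEPTION>.[^:]+):\s\{\s(?P<MSG>.[^}]+)")
m = re.search(pattern, line)
print(m.group("CODE"), m.group("EXCEPTION"), m.group("MSG"))
```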
Let me know if you need more.
🙂 Okay, I think I have it:
| rex "Name\:\s(?<NAME>.[^\n]+)\nSummary\:\s(?<SUM>.[^\n]+)\nDescription\:\s(?<DES>(?:\s|.)*?(?=Category))Category\:\s(?<CAT>.[^\n]+)\nState\:\s(?<STATE>.[^\n]+)\nPublisher\:\s(?<PUB>.[^\n]+)\nVersion\:\s(?<VER>.[^\n]+)\nBuild\sRelease\:(?<BUILD>.[^\n]+)\nBranch\:\s(?<BRANCH>.[^\n]+)\nPackaging\sDate\:(?<PDATE>.[^\n]+)\nSize\:(?<SIZE>.[^\n]+)\nFMRI\:\s(?<FMRI>.[^\n]+)"
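A shortened Python version of the same field-per-line extraction, for quick local testing (the sample text below is a made-up snippet in the same layout, covering only a few of the fields):

```python
import re

# Made-up snippet in the same layout as the event being parsed.
text = """Name: pkg:/web/server
Summary: Example web server
Category: Web Services
State: Installed
Version: 1.2.3
"""

# Shortened version of the rex above, using Python's (?P<name>...) groups.
pattern = (r"Name:\s(?P<NAME>[^\n]+)\n"
           r"Summary:\s(?P<SUM>[^\n]+)\n"
           r"Category:\s(?P<CAT>[^\n]+)\n"
           r"State:\s(?P<STATE>[^\n]+)\n"
           r"Version:\s(?P<VER>[^\n]+)")
m = re.search(pattern, text)
print(m.group("NAME"), m.group("VER"))  # pkg:/web/server 1.2.3
```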
Let me know if there are issues.
Ah, I would change the search window to only the last 60 minutes or a few hours. As you are seeing, since you are looking back 24 hours, it is going to return any other alerts triggered in the last 24 hours.
I am pulling SCCM logs like this:
[default]
host = ABCSCCM01
[monitor://E:\Microsoft Configuration Manager\Logs\*.log]
sourcetype = sccm_log_raw
index = sccm
I'm also querying v_FullCollectionMembership and v_AssignmentState_Combined via DBConnect (for us it's just MS SQL Server using the jTDS driver), but there are obviously many more tables you can pull from.
Agreed with @elliotproebstel: your base search should return statistically aggregated data, which is then passed on to the post-process searches. If you need raw data from the base search, you might be better off running the same search twice rather than using post-processing. Refer to the documentation for best practices: http://docs.splunk.com/Documentation/Splunk/latest/Viz/Savedsearches#Best_practices
Also check out the examples in the documentation above, which show recursive post-processing and how complex statistical data can be passed from the base search to post-process searches.
Refer to one of my recent answers to use Post Processing to show plot Timechart and Pie Chart: https://answers.splunk.com/answers/637178/how-to-generate-a-pie-chart.html
Unfortunately the fault is on the Cisco side. I need to modify the logging level, which I will do tomorrow. It turns out that level 6 and 7 events are not being forwarded to Splunk.
@Harold9000, it seems like you are hitting the subsearch limit, which is 50K by default for join. Refer to the documentation: http://docs.splunk.com/Documentation/Splunk/7.0.3/Search/Aboutsubsearches#Output_settings_for_subsearch_commands
Since you seem to be running a kind of self-join, would it be possible to provide some sample data (mock/anonymize any sensitive information) and more details of what you expect?