The overall difficulty of this exercise depends on your Logstash configuration and use case. If you have just one sourcetype to ingest, you can probably do it relatively easily. But if you want to send multiple sourcetypes over a single connection, they can be tricky to separate on the receiving side. You could send multiple sourcetypes using multiple HEC tokens so they are received into separate indexes or with separate sourcetypes, but that gets complicated and, as I said before, needs proper configuration on the Logstash side. Either way, it's still up to Logstash to filter events before sending.
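As a rough illustration, per-sourcetype routing on the Logstash side might look something like this (a minimal sketch using the generic `http` output plugin; the URL, the conditional field, and the token placeholders are assumptions, not a tested configuration):

```
output {
  if [fields][sourcetype] == "sourcetype_a" {
    http {
      url         => "https://splunk.example.com:8088/services/collector/event"
      http_method => "post"
      format      => "json"
      headers     => { "Authorization" => "Splunk <TOKEN-FOR-SOURCETYPE-A>" }
    }
  } else if [fields][sourcetype] == "sourcetype_b" {
    http {
      url         => "https://splunk.example.com:8088/services/collector/event"
      http_method => "post"
      format      => "json"
      headers     => { "Authorization" => "Splunk <TOKEN-FOR-SOURCETYPE-B>" }
    }
  }
}
```

Each HEC token can then be bound to its own default index and sourcetype in the token's settings on the Splunk side, so the separation happens at ingest without extra parsing.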
Try doubling up the $'s, as single $'s are for tokens in dashboards:

index=hello sourcetype=welcome
| stats max(DATETIME) as LatestTime
| map search="search index=hello sourcetype=welcome DATETIME=$$LatestTime$$"
| stats sum(HOUSE_TRADE_COUNT) as HOUSE_Trade_Count
Assuming you already have the fields extracted:

<your index search>
| stats count by Name Version host
| eventstats count by Name Version
| eventstats max(count) as top
| where count=top
Hi,
If I run this query in the plain search bar it works fine. However, when I create a panel and add the query below, I get an error saying "waiting for input".
Could you please advise?
index=hello sourcetype=welcome
| stats max(DATETIME) as LatestTime
| map search="search index=hello sourcetype=welcome DATETIME=$LatestTime$"
| stats sum(HOUSE_TRADE_COUNT) as HOUSE_Trade_Count
Thanks,
selvam.
I am searching for "Unified Payment Platform Version=", which contains the specific firmware version, across about 2000+ hosts. The line I am searching for may appear multiple times depending on whether the device was rebooted. The search I need should:
- list all the versions, but count each host only once
- if possible, also list the hosts on each version
Hi All, I have an output from a lookup table in Splunk where the team work timings field comes out as:

TeamWorkTimings
09:00:00-18:00:00

I want the output separated into two fields, like:

TeamStart | TeamEnd
09:00:00 | 18:00:00

Please help me get this output in Splunk.
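One way to split such a field is with `rex`; here is a minimal sketch, assuming the field is literally named TeamWorkTimings and always has the HH:MM:SS-HH:MM:SS shape (`makeresults`/`eval` just fake a sample row for testing):

```
| makeresults
| eval TeamWorkTimings="09:00:00-18:00:00"
| rex field=TeamWorkTimings "^(?<TeamStart>[^-]+)-(?<TeamEnd>.+)$"
| table TeamStart TeamEnd
```

In the real search, the first two lines would be replaced by the lookup output that produces TeamWorkTimings.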
Hello, I have the below dataset from a Splunk search:

Name | percentage
A | 71%
B | 90%
C | 44%
D | 88%
E | 78%

I need to color the percentage field values in the email alert according to this rule: 95+ = green, 80-94 = amber, <80 = red. My requirement is to achieve this by updating sendemail.py. @tscroggins @ITWhisperer @yuanliu @bowesmana
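Since sendemail.py is Python, the color rule itself could be sketched as a small helper like the one below. The function names are hypothetical, and where exactly to hook this into sendemail.py's HTML table rendering depends on your Splunk version and is not shown here:

```python
def percentage_color(value):
    """Map a percentage string like "90%" to a CSS color.

    Assumed rule from the post: 95+ = green, 80-94 = amber, below 80 = red.
    """
    pct = int(value.rstrip("%"))
    if pct >= 95:
        return "green"
    if pct >= 80:
        return "#FFBF00"  # amber
    return "red"


def colored_cell(value):
    """Wrap a percentage value in a styled span for the HTML email body."""
    return '<span style="color:%s">%s</span>' % (percentage_color(value), value)
```

Applying `colored_cell` to each percentage cell while the HTML table is being built would produce, e.g., a red "71%" and a green "95%". Note that modifying sendemail.py directly is unsupported and will be overwritten on upgrade.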
Try something like this:

| eval {Function}_TIME=_time
| stats values(Date_of_reception) as Date_of_reception values(*_TIME) as *_TIME by JOBNAME
| eval Diff=ENDED_TIME-STARTED_TIME
| fieldformat STARTED_TIME=strftime(STARTED_TIME,"%H:%M:%S")
| fieldformat ENDED_TIME=strftime(ENDED_TIME,"%H:%M:%S")
| fieldformat PURGED_TIME=strftime(PURGED_TIME,"%H:%M:%S")
| fieldformat Diff=tostring(Diff,"duration")
Hi, I am getting Axios 500 errors after installing the Salesforce Streaming API add-on on my Splunk Cloud Trial (Classic). I can't configure the Inputs or Configuration tabs at all. I have a feeling that this add-on isn't properly supported in Trial Cloud instances. Has anyone had any luck getting this to work on Cloud Classic? Am I missing an additional configuration or app that I need to install to get this working? Any help would be greatly appreciated. P.S.: I was able to get the Salesforce add-on to install, configure, and connect to my Sandbox just fine. It is this Streaming API add-on that seems to be the issue.
This gives me the result in the below format. Is it possible to add one more field to the table and sort the columns in the below order?

| JOBNAME | Date_of_reception | STARTED_TIME | ENDED_TIME | PURGED_TIME | Diff Between STARTED_TIME and ENDED_TIME |
| $VVF119P | 2024/04/17 | 02:12:37 | 02:12:46 | 02:12:50 | 00:00:09 |