Aren't you overcomplicating it a bit? Just render the date to a field with | eval day=strftime(_time,"%F") and you're ready to go: | stats min(_time) as earliest max(_time) as latest by day
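The two-step pattern above (bucket each event by calendar day, then take min/max per day) can be sketched outside Splunk as well. Here is a minimal Python equivalent; the timestamps are invented for illustration and stand in for `_time`:

```python
from collections import defaultdict
from datetime import datetime, timezone

# Hypothetical event timestamps (epoch seconds); in Splunk these would be _time.
timestamps = [1700000000, 1700003600, 1700086400, 1700090000]

# Equivalent of: | eval day=strftime(_time, "%F")   (%F is %Y-%m-%d)
by_day = defaultdict(list)
for ts in timestamps:
    day = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d")
    by_day[day].append(ts)

# Equivalent of: | stats min(_time) as earliest max(_time) as latest by day
summary = {day: (min(v), max(v)) for day, v in by_day.items()}
print(summary)
```

Note the sketch uses UTC; Splunk's strftime renders in the search-time timezone, so day boundaries may differ.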
Yes, I understand it's the copy part you want; unless the system logs it, it's going to be tricky. This may be of help, not sure. I remember that many years ago I helped a company monitor for when their important/sensitive files on a Windows Server were being accessed, copied, or deleted. We got the Windows administrator to enable security auditing for the important files (read/write/delete attributes, etc.). We then tested for files being copied/deleted/accessed; the event IDs were something like below, and we sent these to Splunk and were able to monitor them for insider threats. (This was only done for the most important files, as they can generate lots of events.) If you enable these, they may contain more info as to the device as well; can't remember, but worth a go.
Event ID 4656: A handle to an object was requested.
Event ID 4663: An attempt was made to access an object.
Event ID 4660: An object was deleted.
Thank you, this one is working for me.
Lastly I will figure out how to organize this by Day in desc order; right now it is sorting the results by another column...  Yes, Splunk has some weird obsession with alphabetic/ASCII ordering unless you tell it otherwise. (It kind of surprises me that the "natural" sorting order is already set in the groupby (numeric on the original value of "day"), but Splunk changes it after adding the character string "day -".)  All you need to do is insert a sort after stats and before that string conversion. When you say desc order, I imagine that you want the reverse numeric order.  Is this correct? index=*app_pcf cf_app_name="mddr-batch-integration-flow" "posbatch04" earliest=-14d@d latest=-0d@d | eval dayback = mvrange(0, 14) | eval day = mvmap(dayback, if(_time < relative_time(now(), "-" . dayback . "d@day") AND relative_time(now(), "-" . tostring(dayback + 1) . "d@day") < _time, dayback, null())) | stats min(_time) as Earliest max(_time) as Latest by day | sort - day | fieldformat Earliest = strftime(Earliest, "%F %T") | fieldformat Latest = strftime(Latest, "%F %T") | eval day = "day -" . tostring(day + 1) | eval Elapsed_Time=Latest-Earliest, Start_Time_Std=strftime(Earliest,"%H:%M:%S:%Y-%m-%d"), End_Time_Std=strftime(Latest,"%H:%M:%S:%Y-%m-%d") | eval Elapsed_Time=Elapsed_Time/60 If you want the last day first, just use | sort day instead.
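The "weird obsession with alphabetic/ASCII ordering" is plain string comparison: once day becomes a label like "day -10", it sorts character by character. A small Python sketch (labels are generated for illustration) shows why sorting must happen on the number before the string conversion:

```python
# Labels like "day -N" sort character-by-character as strings,
# so "day -10" lands between "day -1" and "day -2".
labels = [f"day -{n}" for n in range(1, 12)]

# Splunk's default string ordering after the "day -" conversion:
lexicographic = sorted(labels)

# Sorting on the underlying number (i.e. putting | sort before the
# string conversion, as suggested above) restores the intended order:
numeric = sorted(labels, key=lambda s: int(s.rsplit("-", 1)[1]))

print(lexicographic[:4])
print(numeric[:4])
```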
Hi @cbiraris , I suppose that these three values are in a field (e.g. type), so you can run a search like the following: index=abc sourcetype=xyz type IN ("warning", "Error", "Critical") | stats count(eval(type="warning")) AS warning_count count(eval(type="Error")) AS Error_count count(eval(type="Critical")) AS Critical_count | where warning_count>5 OR Error_count>5 OR Critical_count>5 You can also set up a different threshold for each type of message. If you don't have the three values in a field, you have to use a similar search: index=abc sourcetype=xyz ("warning" OR "Error" OR "Critical") | stats count(eval(searchmatch("warning"))) AS warning_count count(eval(searchmatch("Error"))) AS Error_count count(eval(searchmatch("Critical"))) AS Critical_count | where warning_count>5 OR Error_count>5 OR Critical_count>5 Ciao. Giuseppe
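The per-severity threshold logic in the search above boils down to counting each type independently and firing if any count exceeds its own limit. A minimal Python sketch, with invented sample events standing in for the last 15 minutes of extracted type values:

```python
from collections import Counter

# Hypothetical extracted "type" values from the last 15 minutes of events.
events = ["warning", "Error", "warning", "Critical", "warning",
          "warning", "Error", "warning", "warning"]

# One threshold per type, mirroring:
# | where warning_count>5 OR Error_count>5 OR Critical_count>5
thresholds = {"warning": 5, "Error": 5, "Critical": 5}

counts = Counter(events)
triggered = [t for t, limit in thresholds.items() if counts[t] > limit]
print(triggered)
```

Having a separate limit per key is what lets you "set up a different threshold for each type of message".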
Yes, only if any of them individually occur 5 times e.g. 5 warnings or 5 errors or 5 criticals
Thanks, yes I'm aware of those. We have media connected to our clients for various reasons. I'm specifically looking for an alert that data is being burned onto removable media.
Please clarify your requirement - do you want the alert to trigger if any of the values occurs 5 times e.g. 2 warnings, 2 errors and 1 critical, or only if any of them individually occur 5 times e.g. 5 warnings or 5 errors or 5 criticals?
Hi team, I need help to create a query with 3 different thresholds for 3 different events in a single Splunk alert. For example: index=abc sourcetype=xyz "warning" OR "Error" OR Critical If any of these ("warning" OR "Error" OR Critical) occurred 5 times in events in the last 15 minutes, an alert should be triggered.
Hi @Harish2 , the most efficient solution is using a lookup; if you aren't able to do this, you have to insert all the conditions in the input, something like this: <form version="1.1" theme="light"> <label>Dashboard</label> <fieldset submitButton="false"> <input type="time" token="timepicker"> <label>TimeRange</label> <default> <earliest>-15m@m</earliest> <latest>now</latest> </default> </input> <input type="dropdown" token="env"> <label>Environment</label> <choice value="*">All</choice> <prefix>env="</prefix> <suffix>"</suffix> <default>*</default> <fieldForLabel>env</fieldForLabel> <fieldForValue>env</fieldForValue> <search> <query> | makeresults | eval env="DEV" | fields env | append [ | makeresults | eval env="SIT" | fields env ] | append [ | makeresults | eval env="UAT" | fields env ] | append [ | makeresults | eval env="SYS" | fields env ] | sort env | table env </query> </search> </input> <input type="dropdown" token="server"> <label>Server</label> <choice value="*">All</choice> <prefix>server="</prefix> <suffix>"</suffix> <default>*</default> <fieldForLabel>server</fieldForLabel> <fieldForValue>server</fieldForValue> <search> <query> | makeresults | eval env="DEV", server="amptams.dev.com" | fields env server | append [ | makeresults | eval env="DEV", server="ampvitss.dev.com" | fields env server ] | append [ | makeresults | eval env="DEV", server="ampdoctrc.dev.com" | fields env server ] | append [ | makeresults | eval env="SIT", server="ampastdmsg.dev.com" | fields env server ] | append [ | makeresults | eval env="SIT", server="ampmorce.dev.com" | fields env server ] | append [ | makeresults | eval env="SIT", server="ampsmls.dev.com" | fields env server ] | append [ | makeresults | eval env="UAT", server="ampserv.dev.com" | fields env server ] | append [ | makeresults | eval env="UAT", server="ampasoomsg.dev.com" | fields env server ] | append [ | makeresults | eval env="SYS", server="ampmsdser.dev.com" | fields env server ] | append [ | makeresults | 
eval env="SYS", server="ampastcol.dev.com" | fields env server ] | search $env$ | dedup server | sort server | table server </query> </search> </input> </fieldset> <row> <panel> <table> <title>Incoming Count &amp; Total Count</title> <search> <query> index=app-index source=application.logs $env$ $server$ ( "Initial message received with below details" OR "Letter published correctley to ATM subject" OR "Letter published correctley to DMM subject" OR "Letter rejected due to: DOUBLE_KEY" OR "Letter rejected due to: UNVALID_LOG" OR "Letter rejected due to: UNVALID_DATA_APP" ) | rex field=_raw "application :\s(?<Application>\w+)" | rex field=_raw "(?<Msgs>Initial message received with below details|Letter published correctley to ATM subject|Letter published correctley to DMM subject|Letter rejected due to: DOUBLE_KEY|Letter rejected due to: UNVALID_LOG|Letter rejected due to: UNVALID_DATA_APP)" | chart count over Application by Msgs | rename "Initial message received with below details" AS Income, "Letter published correctley to ATM subject" AS ATM, "Letter published correctley to DMM subject" AS DMM, "Letter rejected due to: DOUBLE_KEY" AS Reject, "Letter rejected due to: UNVALID_LOG" AS Rej_log, "Letter rejected due to: UNVALID_DATA_APP" AS Rej_app | table Income ATM DMM Reject Rej_log Rej_app </query> <earliest>$timepicker.earliest$</earliest> <latest>$timepicker.latest$</latest> <sampleRatio>1</sampleRatio> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentageRow">false</option> <option name="refresh.display">progressbar</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> </table> </panel> </row> </form> Ciao. Giuseppe
It's from Hack The Box Academy; it's one of their questions, and the hint is to use the range() function.
Maybe you need to have these enabled; work with your Windows admin to get them enabled or to check for you. Or simply insert a drive and look for the logs. Here are some common event IDs that I was able to find using Google! Some may be logged and others not when drives are added, so you will have to work with your Windows admin to find out whether they have been enabled and are logging, or use Sysmon, which is a tool that also monitors and creates events. I have listed the TAs that you will need once you have all the logs in place.
Event ID 12: This event is logged when removable media is inserted into the computer.
Event ID 106: This event is logged when a new external storage device is connected to the system.
Event ID 20001: This event is logged by Microsoft-Windows-DriverFrameworks-UserMode when a new device is connected to the system.
Event ID 20003: This event is logged by Microsoft-Windows-DriverFrameworks-UserMode when a device is removed from the system.
Event ID 1001: This event is logged when a device is enumerated by the Plug and Play manager.
Event ID 1003: This event is logged when a device is started.
Event ID 1004: This event is logged when a device is stopped.
Splunk TAs:
Windows logs https://splunkbase.splunk.com/app/742
Sysmon logs / info https://splunkbase.splunk.com/app/5709
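Once events like these are being collected, the alerting logic is just a filter on event ID. A hedged Python sketch, where the record layout and field names are invented for illustration (real field names depend on the TA doing the extraction):

```python
# Event IDs from the list above associated with removable-media / device activity.
REMOVABLE_MEDIA_IDS = {12, 106, 20001, 20003, 1001, 1003, 1004}

# Hypothetical parsed Windows event records.
events = [
    {"EventID": 12, "host": "ws01"},     # removable media inserted
    {"EventID": 4663, "host": "ws01"},   # file-access audit event, not device activity
    {"EventID": 20001, "host": "ws02"},  # new device connected
]

# Keep only the device/removable-media events.
alerts = [e for e in events if e["EventID"] in REMOVABLE_MEDIA_IDS]
print([e["EventID"] for e in alerts])
```

In Splunk itself the equivalent is an EventCode filter in the base search of the alert.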
@gcusello , Sorry if I have confused you. If we have 2 hosts for each env, as shown below, do we have other solutions apart from a lookup file to create the dropdown env-wise? dev 2 hosts, sit 2 hosts, sys 2 hosts, uat 2 hosts
Thanks for clarifying. I have added the new Ubuntu servers to the cluster and will initiate the data rebalance today.  Doing some reading on removing a peer, and it looks like it's as simple as running the offline command on the peer, correct? Assuming the cluster is healthy, just run this, wait for it to finish, and the server can be shut down.    splunk offline --enforce-counts
Thanks @yuanliu for your quick response. I am totally unaware of how to achieve this by creating a custom command.
Before, I tried adding |fillnull value=0 at the end of my query, which did not work. Now I tried fillnull App1 App2 App3 App4 and it's working.
Thanks for your answer; I guess I need to provide more detail. This is a Windows 11 client and Windows Server 2012 system. I was not able to find an event ID for this activity in Event Viewer.
Splunk will report on the data it has, so you first have to identify which logs or other data sources contain the data that shows files being copied to removable media; this is something you have to find out based on your systems. Once you know where the data is, you will need to ingest the data source into Splunk, extract fields, and use them to report via Splunk.
Thank you @yuanliu ! This worked really well.  I added my eval commands to it as well and was able to produce the table that I was seeking, with your great query as a guide.   I've expanded the time range to 14 days because I realized 7 days was a little pointless since most of my batches only run M-F.  My final query ended up being:  index=*app_pcf cf_app_name="mddr-batch-integration-flow" "posbatch04" earliest=-14d@d latest=-0d@d | eval dayback = mvrange(0, 14) | eval day = mvmap(dayback, if(_time < relative_time(now(), "-" . dayback . "d@day") AND relative_time(now(), "-" . tostring(dayback + 1) . "d@day") < _time, dayback, null())) | stats min(_time) as Earliest max(_time) as Latest by day | fieldformat Earliest = strftime(Earliest, "%F %T") | fieldformat Latest = strftime(Latest, "%F %T") | eval day = "day -" . tostring(day + 1) | eval Elapsed_Time=Latest-Earliest, Start_Time_Std=strftime(Earliest,"%H:%M:%S:%Y-%m-%d"), End_Time_Std=strftime(Latest,"%H:%M:%S:%Y-%m-%d") | eval Elapsed_Time=Elapsed_Time/60 Lastly I will figure out how to organize this by Day in desc order; right now it is sorting the results by another column...  Much appreciated for the help and the fast response; I would have never figured this out.