All Posts


You're right! My mistake, I didn't read the entire query. Thanks for pointing that out!
Beautiful. Thank you, this worked, and now I understand how to pass _time back in when it gets stripped out earlier.
I am new to Splunk and am observing the event count and current size showing 0 for an index, even though we can search the index and it has data. Any insights would be helpful.
That is because the timechart command requires the _time field, and you are removing it with the first stats command. Try this:

[My search here]
| stats earliest(eval(if(eventType="BEGIN",_time,""))) AS Begin_time latest(eval(if(eventType="END",_time,""))) AS End_time BY UUID processName
| eval ResponseTime=End_time-Begin_time
| eval _time=Begin_time
| timechart span=10m avg(ResponseTime) by processName
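Not part of the original thread: the pairing-and-bucketing logic above can be sketched in Python to show why copying Begin_time back into _time makes the 10-minute bucketing work. Field names and sample events here are illustrative, not taken from the poster's data.

```python
from collections import defaultdict

def response_times(events):
    """Pair the earliest BEGIN with the latest END per (uuid, process)
    and return the duration keyed to the BEGIN time, mirroring the
    stats/eval steps of the SPL above."""
    begins, ends = {}, {}
    for e in events:
        key = (e["uuid"], e["process"])
        if e["type"] == "BEGIN":
            begins[key] = min(begins.get(key, e["time"]), e["time"])
        elif e["type"] == "END":
            ends[key] = max(ends.get(key, e["time"]), e["time"])
    return {k: (begins[k], ends[k] - begins[k]) for k in begins if k in ends}

def timechart_avg(durations, span=600):
    """Average durations per process in fixed-width buckets of the BEGIN
    time, like `timechart span=10m avg(ResponseTime) by processName`."""
    buckets = defaultdict(list)
    for (_uuid, process), (begin, dur) in durations.items():
        buckets[(begin - begin % span, process)].append(dur)
    return {k: sum(v) / len(v) for k, v in buckets.items()}

events = [
    {"uuid": "a", "process": "p1", "type": "BEGIN", "time": 1000},
    {"uuid": "a", "process": "p1", "type": "END", "time": 1005},
    {"uuid": "b", "process": "p1", "type": "BEGIN", "time": 1100},
    {"uuid": "b", "process": "p1", "type": "END", "time": 1111},
]
chart = timechart_avg(response_times(events))
```

Without a per-event timestamp to bucket on, the averaging step has nothing to group into spans, which is exactly why the plain stats-then-timechart attempt returned nothing.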
The date_wday field is being created with the eval command on the second line. I'll break it down for you:

| eval date_hour = strftime(_time, "%H")
| eval date_wday = strftime(_time, "%A")
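For anyone unsure what those format codes produce, the same strftime codes exist in Python: %H is the zero-padded hour and %A is the full weekday name. The epoch value below is just an example.

```python
import time

# Epoch 259200 = 1970-01-04 00:00:00 UTC; Jan 1 1970 was a Thursday,
# so three days later is a Sunday.
epoch = 86400 * 3
date_hour = time.strftime("%H", time.gmtime(epoch))  # zero-padded hour
date_wday = time.strftime("%A", time.gmtime(epoch))  # full weekday name
```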
I don't know if my approach is the right way to go. As I learned, join allows only 50,000 records to be joined, and I expect far more events to be joined to the filtered transactions.
Hi, I am looking to set up an alert which is supposed to run every weekday at 7:30 PM. The search window for the alert query should be from 7 PM the previous day to 7 PM the current day. How can I set up this alert? Thanks
Thank you everyone for taking the time to read this. I am new to Splunk and interested in learning more. I have a project at home that has to do with viewing authentication traffic on a given network.

The challenge I face: I need to view which authentication method is being used to access which resource on the network, for a given index and sourcetype. For example, Windows systems do not have a single attribute representing whether access to the node was SSO or MFA; all I get is an event ID 4624. Windows Event ID 4624, successful logon — Dummies guide, 3 minute read (manageengine.com)

My understanding is that I have to gather a few attributes and make an educated guess about which access method was used. I was hoping to find a one-liner, lol, that will show me what resource is using what authentication method. Any help would be appreciated, and virtual drinks on me if we strike gold.
I asked in a previous thread for help to get response time based on the time differential between two events connected by a UUID (Solved: Re: Measuring time difference between 2 entries - Splunk Community), which is working perfectly. I turned that into an average response time grouped by a particular transaction type (processName), and that's working fine as well, but I would very much like to use this as a timechart, and I can't seem to get it working. From what I understand, the fact that I am using stats strips out the _time field which timechart uses, but I am not sure how to work around that. My query goes as follows:

[My search here]
| stats earliest(eval(if(eventType="BEGIN",_time,""))) AS Begin_time latest(eval(if(eventType="END",_time,""))) AS End_time BY UUID processName
| eval ResponseTime=End_time-Begin_time
| stats avg(ResponseTime) by processName

I've tried a number of things that didn't work, including changing the final stats to:

| timechart span=10m avg(ResponseTime) by processName

While this did perform a search, it generated no results whatsoever. I won't bore everyone with my multiple failures. My query gives me basically:

processName  avg(ResponseTime)
Process1     0.5
Process2     0.6
Process3     0.7

My goal is to get this as a timechart visualization with a span of 10 minutes. Any suggestions? Thanks
I think this would work perfectly, but the system does not appear to have date_wday enabled. Using this field always gives me "no results".
| stats dc(sender_email) as Sender_email_count by action

Is this what you are after? If not, please provide some anonymised sample events and the expected output to clarify your requirement.
You could try using eventstats to tag each event with the aggregated value for the transaction it is a part of and use this to filter the events.
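A sketch of that eventstats idea in Python, with hypothetical field names (txn, attr1): compute the distinct-value count per transaction, let every event see its transaction's count, and keep only events whose transaction passes the uniqueness check.

```python
from collections import defaultdict

def filter_unique_transactions(events, key, attrs):
    """Like `eventstats dc(attr) by transaction_id` followed by `where`:
    each event is tagged with its transaction's distinct counts, and only
    events whose transaction has exactly one value per attribute survive."""
    distinct = defaultdict(lambda: defaultdict(set))
    for e in events:
        for a in attrs:
            distinct[e[key]][a].add(e[a])
    return [e for e in events
            if all(len(distinct[e[key]][a]) == 1 for a in attrs)]

events = [
    {"txn": 1, "attr1": "x"},
    {"txn": 1, "attr1": "x"},
    {"txn": 2, "attr1": "x"},
    {"txn": 2, "attr1": "y"},  # txn 2 has two distinct values, so it is dropped
]
kept = filter_unique_transactions(events, "txn", ["attr1"])
```

Because every event carries its transaction's aggregate, no join back to the index is needed, which sidesteps the join row limit mentioned above.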
Hi, I have a search as below. I want to find the count of recipients by action, i.e. how many users received the email vs not, for every event.

index=a sourcetype="a"
| bucket span=4h _time
| stats values(action) as email_action, values(Sender) as Sender, dc(sender_email) as Sender_email_count, values(subject) as subject, dc(URL) as url_count, values(URL) as urls, values(filename) as files, values(recipients_list) as recipients_list by sender_name, _time
| search (subject="*RE:*")

Any help would be appreciated. Thank you!
Hi @richgalloway, thanks for the reply, but may I know what needs to be done here so that data is forwarded to the indexer and search results are obtained?
Coming from SQL, I want to do stuff like GROUP BY and HAVING. The data is available with a transaction identifier, and grouping should be done by that transaction identifier. Per transaction, I want to check whether the values of a few attributes are unique within each transaction. In SQL terms:

select transaction_id
from index
group by transaction_id
having count(distinct attr1) = 1
   and count(distinct attr2) = 1
   and count(distinct attr3) = 1

From that table of transaction_ids, a join to the same index should be done to filter the events. How can I achieve this with a Splunk query?
_time and now() provide times in epoch format, i.e. the number of seconds since the beginning of 1970. You can calculate the difference between these two numbers, e.g. diff = now() - _time. strftime() converts epoch times to strings; you can't find the difference in time by subtracting one string from another, because strings are the wrong data type for numerical operations!
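The same point illustrated in Python, using made-up timestamps: epoch values subtract cleanly, while strftime output is a display string only.

```python
import time

event_time = 1718196855          # epoch seconds, like Splunk's _time
now = event_time + 3600          # pretend "now" is one hour later
diff = now - event_time          # numeric difference, in seconds

# strftime gives a display string; subtracting two such strings would
# raise a TypeError, which is the wrong-data-type problem described above.
label = time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(event_time))
```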
Any solution? I get the same error.
I cannot access the Security Content dashboard because I receive this message:  
I have been trying to get the following sourcetype into Splunk for PI. This whole stanza should go in as one event, but I've been unable to stop it from being broken down into multiple events:

{
  "Parameters": null,
  "ID": 2185,
  "TimeStamp": "\/Date(1718196855107)\/",
  "Message": "User query failed: Connection ID: 55, User: xxxxx, User ID: 1, Point ID: 247000, Type: summary, Start: 12-Jun-24 08:52:45, End: 12-Jun-24 08:54:15, Mode: 5, Status: [-11059] No Good Data For Calculation",
  "ProgramName": "sssssss",
  "Category": null,
  "OriginatingHost": null,
  "OriginatingOSUser": null,
  "OriginatingPIUser": null,
  "ProcessID": 5300,
  "Priority": 10,
  "ProcessHost": null,
  "ProcessOSUser": "SYSTEM",
  "ProcessPIUser": null,
  "Source1": "piarcset",
  "Source2": "Historical",
  "Source3": null,
  "SplunkTime": "1718196855.10703",
  "Severity": "Warning"
},

I have even tried using the _json sourcetype that ships with Splunk, but it keeps breaking it into multiple lines/events. Any suggestions would be helpful.
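Not from the thread, and untested against this actual feed: one common approach is to disable line merging and define a LINE_BREAKER that only matches between records, in props.conf on the parsing tier. The sourcetype name below is hypothetical.

```ini
[pi:messagelog]
SHOULD_LINEMERGE = false
# Break only where one record ends and the next begins: the closing brace
# stays with the previous event, the comma/whitespace in the capture group
# is discarded, and the opening brace starts the next event.
LINE_BREAKER = \}(,\s*)\{
TRUNCATE = 0
```

With SHOULD_LINEMERGE disabled, Splunk breaks events only at the first capture group of LINE_BREAKER, so the newlines inside each JSON object no longer split it into separate events.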
I have been trying to get the following sourcetype into Splunk for PI.  This whole stanza should go in as 1 event, but I've been unable to get the breakdown to multiple events from happening: { "Parameters": null, "ID": 2185, "TimeStamp": "\/Date(1718196855107)\/", "Message": "User query failed: Connection ID: 55, User: xxxxx, User ID: 1, Point ID: 247000, Type: summary, Start: 12-Jun-24 08:52:45, End: 12-Jun-24 08:54:15, Mode: 5, Status: [-11059] No Good Data For Calculation", "ProgramName": "sssssss", "Category": null, "OriginatingHost": null, "OriginatingOSUser": null, "OriginatingPIUser": null, "ProcessID": 5300, "Priority": 10, "ProcessHost": null, "ProcessOSUser": "SYSTEM", "ProcessPIUser": null, "Source1": "piarcset", "Source2": "Historical", "Source3": null, "SplunkTime": "1718196855.10703", "Severity": "Warning" }, I have even tried using the _json defaulted with Splunk, but it keeps breaking it into multiple lines/events.  Any suggestions would be helpful.