All Posts


I think this would work perfectly, but the system does not appear to have date_wday enabled. Using this term always gives me "no results".
| stats dc(sender_email) as Sender_email_count by action

Is this what you are after? If not, please provide some anonymised sample events and some expected output to clarify your requirement.
You could try using eventstats to tag each event with the aggregated value for the transaction it is a part of and use this to filter the events.
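A minimal SPL sketch of that eventstats approach, assuming the transaction field is literally called transaction_id and the attributes to check are attr1, attr2 and attr3 (all placeholder names, as is your_index):

index=your_index
| eventstats dc(attr1) as dc1 dc(attr2) as dc2 dc(attr3) as dc3 by transaction_id
| where dc1=1 AND dc2=1 AND dc3=1

Because eventstats keeps the original events and only adds the aggregated fields, no join back to the index is needed after the filter.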
Hi, I have a search as below. I want to find the count of recipients by action, i.e. how many users received the email versus how many did not, for every event.

index=a sourcetype="a"
| bucket span=4h _time
| stats values(action) as email_action, values(Sender) as Sender, dc(sender_email) as Sender_email_count, values(subject) as subject, dc(URL) as url_count, values(URL) as urls, values(filename) as files, values(recipients_list) as recipients_list by sender_name, _time
| search (subject="*RE:*")

Any help would be appreciated, thank you!
Hi @richgalloway, Thanks for the reply, but may I know what needs to be done here so that the data is forwarded to the indexer and search results can then be obtained?
Coming from SQL, I want to do stuff like GROUP BY and HAVING. The data comes with a transaction identifier, and grouping should be done by that identifier. Per transaction, I want to check whether the values of a few attributes are unique within each transaction. In SQL terms:

select transaction_id
from index
group by transaction_id
having count(distinct attr1) = 1
   and count(distinct attr2) = 1
   and count(distinct attr3) = 1

From that table of transaction_ids, a join back to the same index should be done to filter the events. How can I achieve this with a Splunk query?
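A hedged SPL sketch of one way to express this, using a stats/where pipeline as the HAVING clause and a subsearch to filter the raw events; your_index, transaction_id, attr1, attr2 and attr3 are placeholder names taken from the SQL above:

index=your_index
    [ search index=your_index
      | stats dc(attr1) as dc1 dc(attr2) as dc2 dc(attr3) as dc3 by transaction_id
      | where dc1=1 AND dc2=1 AND dc3=1
      | fields transaction_id ]

The subsearch returns the qualifying transaction_id values, and the outer search turns them into an implicit OR filter, which plays the role of the SQL join. Subsearches have result-count and runtime limits, so for very large transaction lists the eventstats approach suggested in the reply above may scale better.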
_time and now() provide times in epoch format, i.e. the number of seconds since the beginning of 1970. You can calculate the difference between these two numbers, e.g. diff = now() - _time. strftime() converts epoch times to strings; you can't find a time difference by subtracting one string from another, because strings are the wrong data type for numerical operations!
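For example, a minimal sketch that computes the age of each event in seconds and also renders it as a human-readable duration (the field names are illustrative):

| eval age_seconds = now() - _time
| eval age_readable = tostring(age_seconds, "duration")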
Any solution? Same error for me
I cannot access the Security Content dashboard because I receive this message:
I have been trying to get the following sourcetype into Splunk for PI. This whole stanza should be ingested as one event, but I've been unable to stop it from being broken into multiple events:

{ "Parameters": null, "ID": 2185, "TimeStamp": "\/Date(1718196855107)\/", "Message": "User query failed: Connection ID: 55, User: xxxxx, User ID: 1, Point ID: 247000, Type: summary, Start: 12-Jun-24 08:52:45, End: 12-Jun-24 08:54:15, Mode: 5, Status: [-11059] No Good Data For Calculation", "ProgramName": "sssssss", "Category": null, "OriginatingHost": null, "OriginatingOSUser": null, "OriginatingPIUser": null, "ProcessID": 5300, "Priority": 10, "ProcessHost": null, "ProcessOSUser": "SYSTEM", "ProcessPIUser": null, "Source1": "piarcset", "Source2": "Historical", "Source3": null, "SplunkTime": "1718196855.10703", "Severity": "Warning" },

I have even tried the _json sourcetype that ships with Splunk, but it keeps breaking it into multiple lines/events. Any suggestions would be helpful.
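One direction that might help is a props.conf line-breaking stanza applied at parse time (indexer or heavy forwarder). The following is only a sketch under assumptions: the sourcetype name pi:log is hypothetical, every record starts with "Parameters" right after the opening brace as in the sample above, and the timestamp is taken from the epoch milliseconds inside \/Date(...)\/:

[pi:log]
SHOULD_LINEMERGE = false
# break only where a new record begins, not at every newline
LINE_BREAKER = ([\r\n]+)\{\s*"Parameters"
KV_MODE = json
TIME_PREFIX = \\/Date\(
TIME_FORMAT = %s%3N
MAX_TIMESTAMP_LOOKAHEAD = 20
TRUNCATE = 10000

With SHOULD_LINEMERGE disabled, only the LINE_BREAKER regex decides where events start, so the whole multi-field record stays together as one event.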
How do I find the difference, in days and hours respectively, between the event time of the data and the current time? The format of the time, i.e. _time, is: 6/18/24 10:17:15.000 AM

I tried the query below, which gives me the event time and the current server time correctly, but I need help finding the difference.

index=testdata sourcetype=testmydata
| eval currentEventTime=strftime(_time,"%+")
| eval currentTimeintheServer=strftime(now(),"%+")
| eval diff=round(('currentTimeintheServer'-'currentEventTime') / 60)
| eval diff=tostring(diff, "duration")
| table currentEventTime currentTimeintheServer diff index _raw

Please assist.
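Building on the reply above about epoch arithmetic, here is a hedged rework of that search that computes the difference in whole days and remaining hours; diff_seconds, diff_days, diff_hours and diff_readable are illustrative field names:

index=testdata sourcetype=testmydata
| eval diff_seconds = now() - _time
| eval diff_days = floor(diff_seconds / 86400)
| eval diff_hours = floor((diff_seconds % 86400) / 3600)
| eval diff_readable = tostring(diff_seconds, "duration")
| table _time diff_days diff_hours diff_readable

The subtraction is done on the raw epoch values rather than on strftime()-formatted strings, which is why it works.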
@asieira I tried this query but it is not working for me; I'm getting: Error in 'eval' command: The expression is malformed. An unexpected character is reached at `\"%Y-%m-%dT%H:%M:%SZ\"), \`

The macro is the same:

[strftime_utc(2)]
args = field, format
definition = strftime($field$ - (strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%SZ\"), \"%Y-%m-%dT%H:%M:%S%Z\")-strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%S\"), \"%Y-%m-%dT%H:%M:%S\")), \"$format$\")

and now my search looks like:

*My query* | eval utc_time=`strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")`
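One likely cause (an assumption, not confirmed) is the backslash-escaped quotes in the macro definition: macros.conf substitutes the definition text literally into the search, so the \" sequences reach the eval parser and break the expression. A sketch of the same macro with plain quotes, otherwise unchanged:

[strftime_utc(2)]
args = field, format
definition = strftime($field$ - (strptime(strftime($field$, "%Y-%m-%dT%H:%M:%SZ"), "%Y-%m-%dT%H:%M:%S%Z") - strptime(strftime($field$, "%Y-%m-%dT%H:%M:%S"), "%Y-%m-%dT%H:%M:%S")), "$format$")

The call in the search would stay as `strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")`.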
Does anyone have any suggestions, or any query I could leverage, to convert and enforce the user's input time to UTC format only?
Hello Sir/Madam, I am using the on-premises version of the AppDynamics platform and am checking out the latest features of the EUM component for a web application. While checking the EUM data, everything is OK, but the 'Experience Journey Map' page shows the following error message. What is the root cause of the problem?

Http failure response for https://mydomain/controller/restui/eum/common/userJourneyUiService/getPathBasedWebUserJourneyTree: 500 OK

Moreover, the following error occurs in the controller log. You can find the full stack trace of the exception in the attached file.

[#|2024-06-18T05:37:09.855-0500|WARNING|glassfish 4.1|com.singularity.ee.controller.beans.eumcloud.EUMCloudManagerImpl|_ThreadID=31;_ThreadName=http-listener-1(3);_TimeMillis=1718707029855;_LevelValue=900;|Failed to fetch EUM web user journey for account333356-ss-Mpco-vghtqr2am7n4 com.appdynamics.eum.rest.client.exception.TransportException: Communication failure with service (http://myip:7001/userjourney/v3/web com.appdynamics.eum.client.deps.com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "pageUrlPrefix" (class com.appdynamics.eum.platform.userjourney.query.api.query.PathBasedTreeNode), not marked as ignorable (10 known properties: "outgoingMap", "aggregatedOutgoingQoSList", "outgoingCount", "outgoingNodeDetails", "parentIdString", "incomingCount", "incomingNodeDetails", "name", "nodeIdString", "levelCount"]) at [Source: (com.appdynamics.eum.client.deps.org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream); line: 1, column: 92] (through reference chain: com.appdynamics.eum.platform.userjourney.query.api.result.PathBasedUserJourneyTree["nodes"]->java.util.ArrayList[0]->com.appdynamics.eum.platform.userjourney.query.api.query.PathBasedTreeNode["pageUrlPrefix"]).|#]

[#|2024-06-18T05:37:09.860-0500|SEVERE|glassfish 4.1|com.appdynamics.controller.persistence.ControllerExceptionHandlingInterceptor|_ThreadID=31;_ThreadName=http-listener-1(3);_TimeMillis=1718707029860;_LevelValue=1000;|SERVICE=CONTROLLER_BEANS MODULE=CONTROLLER LOGID=ID000401 Encountered a server exception com.singularity.ee.controller.beans.eumcloud.UserFriendlyServerException: Internal server error. See server.log for details.
at com.singularity.ee.controller.beans.eumcloud.EUMCloudManagerImpl.queryPathBasedWebUserJourneyTree(EUMCloudManagerImpl.java:3767)
at com.appdynamics.platform.persistence.TransactionInterceptor.lambda$invoke$0(TransactionInterceptor.java:37)
at com.singularity.ee.controller.beans.model.EJBManagerBean.runWithinRequiredTransaction(EJBManagerBean.java:17)
.....
Hi, I am using this app: https://splunkbase.splunk.com/app/3120. Is it possible to keep the X-axis in view when scrolling down? In the first example the X-axis is visible; in the second, after scrolling down, it is gone. I know I can hover over the time values, but, like in Excel, can we freeze the row so it moves with the scroll? Thanks in advance, Robert
Hi @Naresh.Pula, Your post has received a reply recently. Were you able to check out what Mario said? If it helps, please click the 'Accept as Solution' button. If not, please continue the conversation by replying to this thread.
Hi @Dean.Marchetti, Did you create a ticket yet? 
Hi @tatdat171, I am using Python modules which are shipped with Splunk Enterprise, so to use my script you need Splunk Enterprise as well (you don't need to start/run Splunk Enterprise locally).
Do I understand correctly that after the app upgrade I have to remove my content and create it again from scratch?