All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


How do I find the difference, in days and hours, between the event time of the data and the current time? The format of _time is as follows: 6/18/24 10:17:15.000 AM. I tried the query below, which gives me the event time and the current server time correctly, but I need help finding the difference. index=testdata sourcetype=testmydata | eval currentEventTime=strftime(_time,"%+") | eval currentTimeintheServer=strftime(now(),"%+") | eval diff=round(('currentTimeintheServer'-'currentEventTime') / 60) | eval diff=tostring(diff, "duration") | table currentEventTime currentTimeintheServer diff index _raw Please assist.
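One way to sketch this (adjust to your data): the subtraction has to happen on the raw epoch values, because strftime() returns formatted strings that cannot be subtracted. tostring(x, "duration") then renders the gap in D+HH:MM:SS form:

```
index=testdata sourcetype=testmydata
| eval diff_seconds = now() - _time
| eval diff = tostring(diff_seconds, "duration")
| eval currentEventTime = strftime(_time, "%+")
| eval currentTimeintheServer = strftime(now(), "%+")
| table currentEventTime currentTimeintheServer diff
```

The original query likely fails because 'currentTimeintheServer' and 'currentEventTime' already hold formatted strings at the point of subtraction.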
@asieira I tried this query, but it is not working for me. I am getting: Error in 'eval' command: The expression is malformed. An unexpected character is reached at `\"%Y-%m-%dT%H:%M:%SZ\"), \`   Same macro:  [strftime_utc(2)] args = field, format definition = strftime($field$ - (strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%SZ\"), \"%Y-%m-%dT%H:%M:%S%Z\")-strptime(strftime($field$, \"%Y-%m-%dT%H:%M:%S\"), \"%Y-%m-%dT%H:%M:%S\")), \"$format$\")   And now my search looks like: *My query* | eval utc_time=`strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")`
Does anyone have suggestions, or a query I could use, to convert and enforce the user's input time to UTC format only?
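A guess at the cause, based on the error message quoted above: in macros.conf the double quotes should not be backslash-escaped, and the literal \" sequences are what eval is choking on. A sketch of the macro definition with plain quotes (same logic as the original, untested):

```
[strftime_utc(2)]
args = field, format
definition = strftime($field$ - (strptime(strftime($field$, "%Y-%m-%dT%H:%M:%SZ"), "%Y-%m-%dT%H:%M:%S%Z") - strptime(strftime($field$, "%Y-%m-%dT%H:%M:%S"), "%Y-%m-%dT%H:%M:%S")), "$format$")
```

It would then be invoked as before: ... | eval utc_time=`strftime_utc(_time, "%Y-%m-%dT%H:%M:%SZ")`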
Hello Sir/Madam, I am using the on-premises version of the AppDynamics platform. I am checking the latest features of the EUM component for a web application. While checking the EUM data, everything is OK, but the 'Experience Journey Map' page shows the following error message. What is the root cause of the problem? Http failure response for https://mydomain/controller/restui/eum/common/userJourneyUiService/getPathBasedWebUserJourneyTree: 500 OK Moreover, the following error occurs in the controller log. You can find the full stack trace of the exception in the attached file. [#|2024-06-18T05:37:09.855-0500|WARNING|glassfish 4.1|com.singularity.ee.controller.beans.eumcloud.EUMCloudManagerImpl|_ThreadID=31;_ThreadName=http-listener-1(3);_TimeMillis=1718707029855;_LevelValue=900;|Failed to fetch EUM web user journey for account333356-ss-Mpco-vghtqr2am7n4 com.appdynamics.eum.rest.client.exception.TransportException: Communication failure with service (http://myip:7001/userjourney/v3/web com.appdynamics.eum.client.deps.com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "pageUrlPrefix" (class com.appdynamics.eum.platform.userjourney.query.api.query.PathBasedTreeNode), not marked as ignorable (10 known properties: "outgoingMap", "aggregatedOutgoingQoSList", "outgoingCount", "outgoingNodeDetails", "parentIdString", "incomingCount", "incomingNodeDetails", "name", "nodeIdString", "levelCount"]) at [Source: (com.appdynamics.eum.client.deps.org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream); line: 1, column: 92] (through reference chain: com.appdynamics.eum.platform.userjourney.query.api.result.PathBasedUserJourneyTree["nodes"]->java.util.ArrayList[0]->com.appdynamics.eum.platform.userjourney.query.api.query.PathBasedTreeNode["pageUrlPrefix"]).|#] [#|2024-06-18T05:37:09.860-0500|SEVERE|glassfish 
4.1|com.appdynamics.controller.persistence.ControllerExceptionHandlingInterceptor|_ThreadID=31;_ThreadName=http-listener-1(3);_TimeMillis=1718707029860;_LevelValue=1000;|SERVICE=CONTROLLER_BEANS MODULE=CONTROLLER LOGID=ID000401 Encountered a server exception com.singularity.ee.controller.beans.eumcloud.UserFriendlyServerException: Internal server error. See server.log for details. at com.singularity.ee.controller.beans.eumcloud.EUMCloudManagerImpl.queryPathBasedWebUserJourneyTree(EUMCloudManagerImpl.java:3767) at com.appdynamics.platform.persistence.TransactionInterceptor.lambda$invoke$0(TransactionInterceptor.java:37) at com.singularity.ee.controller.beans.model.EJBManagerBean.runWithinRequiredTransaction(EJBManagerBean.java:17) .....
Hi, I am using this app: https://splunkbase.splunk.com/app/3120 Is it possible to keep the X-axis in view when scrolling down? In the first example we can see the X-axis; in the second, after scrolling down, it is gone. I know I can hover over the time, but, like in Excel, can we freeze the row so that it moves with the scroll? Thanks in advance, Robert
Hi @Naresh.Pula, Your post has received a reply recently. Were you able to check out what Mario said? If it helps, please click the 'Accept as Solution' button. If not, please continue the conversation by replying to this thread. 
Hi @Dean.Marchetti, Did you create a ticket yet? 
Hi @tatdat171 , I am using Python modules that ship with Splunk Enterprise, so to use my script you need Splunk Enterprise as well (you don't need to start/run Splunk Enterprise locally).
Do I understand correctly that after an app upgrade I have to remove my content and create it again from scratch?
Hi @Poojitha, to do this you don't need to define fields at index time; you can define them at search time and load your data into Data Models. Ciao. Giuseppe
In order to debug your code, it would be useful to see your actual code, or at least a cut-down version that demonstrates the problem. Also, does it occur with large dashboards, or only small ones? Does it occur with fresh browser instances or old ones? Does it occur with different browsers or just one? Which browser(s) does it occur with? Any other information like this might give a clue as to what is happening.
Hi @quadrant8 , 10K events is the limit on subsearch results: if you run the subsearch as a main search, without anything else, do you get more or fewer than 10K events? If more than 10K events, you have to find a different solution (e.g. putting the subsearch in the main search with an OR condition, defining a correlation key, and checking that the correlation key is present in both searches). Ciao. Giuseppe
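The OR-condition approach mentioned above could be sketched roughly like this (index, sourcetype, and field names are placeholders, not taken from the original thread):

```
index=main (sourcetype=typeA OR sourcetype=typeB)
| eval correlation_key = coalesce(field_from_A, field_from_B)
| stats dc(sourcetype) AS sources values(field_from_A) AS field_from_A values(field_from_B) AS field_from_B by correlation_key
| where sources = 2
```

The final where clause keeps only correlation keys that appear in both data sets, which replaces the subsearch join without hitting the 10K limit.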
The answer is in this Splunk blog post. Somewhere in "System Configuration" we can configure the integration with ES. One nuance: I opened this settings menu once, but the second time I can't find it.
Hi @harsmarvania57 , I really appreciate what you did with the script! In your script, you are using the Python package "splunk": from splunk import mergeHostPath import splunk.rest as rest import splunk.auth as auth import splunk.entity as entity   But I can't find the package with pip install. Could you give the correct name of the package?
Hi @tatdat171 The script I created is not out of date. It still works for on-prem and Splunk Cloud. I would like to know which functions didn't work for you. Thanks, Harshil 
Hi Team, We are currently using Classic XML and have made the panels collapsible/expandable using HTML/CSS, following the suggestion from the thread below: https://community.splunk.com/t5/Dashboards-Visualizations/How-to-add-feature-expand-or-collapse-panel-in-dashboard-using/m-p/506986 However, sometimes during the first dashboard load, both the "+" and "-" signs are visible. This happens only occasionally, so I am not able to find the cause. Do you have any suggestions or ideas to fix this? Thank you!
Better late than never: Sample data would be helpful here. The request is a bit confusing, since you seem to want the top 5 URLs per status code, but your URL count stops at 10. With 3 status codes, the top 5 could go to 15, right? For the second point, what UserID would that be? Presumably each URL could be hit by multiple users, and the top 5 codes for each URL would differ per user. 
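If the intent really is "top 5 URLs per status code", a minimal sketch (index, sourcetype, and field names are assumptions, not from the original post):

```
index=web sourcetype=access_combined
| top limit=5 url by status showperc=false
```

top with a by clause keeps the 5 most frequent url values within each status, so the result can legitimately exceed 10 rows when there are 3 or more status codes.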
First load the lookups, and then group both realms using stats. Try something like this and adjust it to your needs, assuming there is a field that is common to both data sets:

| inputlookup lookup1
| inputlookup lookup2 append=true
| stats values(fieldA) AS fieldA (...) by fieldB_common_in_both_datasets

If there is no common field, use rename or eval to create that common field before the stats:

| inputlookup lookup1
| inputlookup lookup2 append=true
| rename fieldC as fieldB
| stats values(fieldA) AS fieldA (...) by fieldB
Better late than never: This needs more information on what you consider month and week boundaries. Does "January, Week 1" mean the first 7 days of January, January 1st to the last day of the week (e.g. Saturday), or the Sunday before January 1st, to the following Saturday? When you say by week and month together, do you just want a label for the month in front of the 52 weeks?  
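Under one possible interpretation (ISO week-of-year numbers are acceptable as the week labels), a grouping sketch could look like this; the index and the amount field are assumptions for illustration:

```
index=sales
| eval month = strftime(_time, "%B"), week = strftime(_time, "%V")
| stats sum(amount) AS total by month, week
```

If "Week 1" should instead mean the first 7 days of each month, the week field would need to be derived from the day of the month rather than from %V.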
I've seen the documentation which says "by default subsearches return a maximum of 10,000 results and have a maximum runtime of 60 seconds", but it's unclear whether that limit applies before or after transforms. For example, does it cap the base search (e.g. the output of index=wineventlogs AND ComputerName=MyDesktop is capped at 10k), or are results over 10k dropped only from the filtered results (e.g. after I add conditions and filters to reduce the final dataset)?
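As far as I understand it (worth verifying against the docs for your version), the cap applies to the final results the subsearch returns to the outer search, i.e. after the subsearch's own filtering has run, and it can be raised via limits.conf:

```
# $SPLUNK_HOME/etc/system/local/limits.conf -- illustrative values only
[subsearch]
maxout = 50000
maxtime = 120
```

Raising maxout increases memory pressure on the search head, so it is usually better to restructure the search (e.g. the OR-condition pattern) than to raise the limit far beyond the default.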