All Posts


How can we measure the number of spools in SAP systems using AppDynamics?
Did you make any progress on this one?
Thanks, but I also think it should be possible to mark everything between login and logout in a timechart. Maybe it's not possible. If not, I will investigate the apps.
You can't fill anything in because you don't differentiate between the login and logout events. The first bar is not necessarily a login followed by a logout, as your first event may be a logout, then a login, then another logout. You would need to make your search treat 4624 as the start (login) event and 4634 as the end (logout) event, rather than just doing a dc(user), which will always be 1. I suggest you look at a couple of apps designed for this instead of timechart: https://splunkbase.splunk.com/app/4370 https://splunkbase.splunk.com/app/3120
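A minimal sketch of pairing the two event codes with transaction, assuming Windows Security event logs with EventCode and user fields (index and field names are assumptions; adjust to your environment):

```spl
index=wineventlog EventCode IN (4624, 4634)
| transaction user startswith="EventCode=4624" endswith="EventCode=4634" maxspan=1d
| table _time, user, duration
```

The duration field then gives the length of each login-to-logout session, which is the information a session-gap timechart cannot express on its own.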
Hi, I'd like to know how to associate the "url" tag with the web data model. We're currently working with URL logs in our Splunk ES, but we're encountering difficulties in viewing the data model when conducting searches. Could someone kindly provide guidance on this matter? Thanks  
What's the definition of your multiselect input? You've only listed the search. You are using Reason $t_reason$ in your search, but if your chart search is coming from the base search, there is no Reason field there, so you cannot filter by reason. Is the t_category token coming from another input? If you are using the syntax Reason $t_reason$ and your input is a multiselect, then it looks odd that you have "Reason" in the search. Is that just searching the raw text for Reason, or is it somehow part of a field called Reason?
Hi @VatsalJagani, Sorry for the delay in coming back to you on this. I have read this documentation and changed the procedure to return only one OUT SYS_REFCURSOR. I don't need to pass anything in, so there is no need to pass params.

create or replace PROCEDURE GET_SOME_LOG_MONITORING (po_error_log_details_01 out SYS_REFCURSOR) AS.........

I can now successfully call the procedure using the below, however I get no data returned from the SYS_REFCURSOR:

| dbxquery connection="SOME-CONNECTION" procedure="{call ISOMESCHEMA.GET_SOME_LOG_MONITORING(?)}"

However, when I get a DBA to log into the database directly on the server with the same user that Splunk is using to execute the PROCEDURE, they get the following results returned:

TIMESTAMP                      CORRELATION                TO_CHAR(MSGXML)
25-OCT-23 11.33.40.968589 AM   a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.44.569205 AM   a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.47.144192 AM   a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.49.823233 AM   a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.51.383443 AM   a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.52.708949 AM   a2306D43d6606aa67f8jgrg21

Splunk has all the permissions to successfully run the PROCEDURE, as proved by this being run directly on the server with data being returned, but when I execute it from Splunk, nothing is returned.
@danspav For some reason the comma is not removed from the token, so it becomes form.multi=val1,&form.multi=val2, and therefore the receiving dashboard gets the first multi value as "val1," with the trailing comma. Not quite sure why that regex keeps the comma.
Hi @satish, I also tried giving "charting.fieldColors" as below. There is still no change in the pie chart. Please advise.

{
    "type": "splunk.pie",
    "title": "Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    },
    "options": {
        "charting.fieldColors": {
            "failed": "#FF0000",
            "passed": "#008000"
        }
    },
    "context": {},
    "showProgressBar": false,
    "showLastUpdated": false
}
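A sketch of an alternative worth trying: the splunk.pie type suggests this is Dashboard Studio, where charting.* options come from Simple XML and may be ignored. Dashboard Studio's pie chart maps specific field values to colors via seriesColorsByField in options (the failed/passed values and hex codes below are taken from the post):

```json
{
    "type": "splunk.pie",
    "title": "Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    },
    "options": {
        "seriesColorsByField": {
            "failed": "#FF0000",
            "passed": "#008000"
        }
    }
}
```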
Hi, I aim to merge the "dropped" and "blocked" values under the "IDS_Attacks.action" field in the output of the datamodel search, and include their combined count under the resulting "blocked" value, so that I can add it to the dashboard.

Current output:

IDS_Attacks.action   count
allowed              130016
blocked              595
dropped              1123
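A minimal sketch of one way to fold "dropped" into "blocked", assuming the counts come from a tstats search over the Intrusion_Detection data model (the tstats line is an assumption; substitute your own search before the eval):

```spl
| tstats count from datamodel=Intrusion_Detection by IDS_Attacks.action
| rename IDS_Attacks.action as action
| eval action=if(action=="dropped", "blocked", action)
| stats sum(count) as count by action
```

The eval relabels "dropped" rows as "blocked", and the final stats sums the two original counts into a single "blocked" row (595 + 1123 = 1718 for the figures shown above).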
Hi @satish, Even after providing the series colors as you mentioned in the previous comment, I could not set/change the color in the pie chart. Could you please help with how to change the color of the pie chart? I want to provide two colors: red for the failed slice and green for the passed slice.

"viz_HoWEnZsV": {
    "type": "splunk.pie",
    "title": "Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    }
},
"ds_PlnRHbXf": {
    "type": "ds.search",
    "options": {
        "query": "index = _internal | chart avg(bytes) over source"
    },
    "seriesColors": ["#61A84F", "#FFBF00", "#FF0000"],
    "name": "Search_1"

Thanks,
Hi, Not sure how to get a continuous bar between login and logout. As you can see in the picture, it's marked as login, a lot of empty space, and then logout. The best would be for everything to be colour-marked from login until logout. I thought it could be done through Format, but not this time. Hope someone can help me with it. Rgds
I have a response from one of the client applications like this:

{
  "employees": {
    "2023-03-16": {
      "1": { "id": 1, "name": "Michael Scott", "email": "demo@desktime.com", "groupId": 1, "group": "Accounting", "profileUrl": "url.com", "isOnline": false, "arrived": false, "left": false, "late": false, "onlineTime": 0, "offlineTime": 0, "desktimeTime": 0, "atWorkTime": 0, "afterWorkTime": 0, "beforeWorkTime": 0, "productiveTime": 0, "productivity": 0, "efficiency": 0, "work_starts": "23:59:59", "work_ends": "00:00:00", "notes": { "Skype": "Find.me", "Slack": "MichielS" }, "activeProject": [] },
      "2": { "id": 2, "name": "Andy Bernard", "email": "demo3@desktime.com", "groupId": 106345, "group": "Marketing", "profileUrl": "url.com", "isOnline": true, "arrived": "2023-03-16 09:17:00", "left": "2023-03-16 10:58:00", "late": true, "onlineTime": 6027, "offlineTime": 0, "desktimeTime": 6027, "atWorkTime": 6060, "afterWorkTime": 0, "beforeWorkTime": 0, "productiveTime": 4213, "productivity": 69.9, "efficiency": 14.75, "work_starts": "09:00:00", "work_ends": "18:00:00", "notes": { "Background": "Law and accounting" }, "activeProject": { "project_id": 67973, "project_title": "Blue Book", "task_id": 42282, "task_title": "Blue Book task", "duration": 6027 } }
      .....
    }
  "__request_time": "1678957028"
}

I am facing a problem with the date key "2023-03-16", as this key changes every day. I want to create statistics based on all employee IDs, late employees, email, etc. for the last 7 days. I have used spath, but I cannot use a wildcard search for late employees across all days. Thanks
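One sketch of a workaround for the changing date key, assuming Splunk 8.1+ eval JSON functions (json_keys, json_extract, json_array_to_mv) and that each API response is indexed as a single event; the index name is an assumption:

```spl
index=desktime
| spath output=days_json path=employees
| eval day=json_array_to_mv(json_keys(days_json))
| mvexpand day
| eval day_json=json_extract(days_json, day)
| eval emp_id=json_array_to_mv(json_keys(day_json))
| mvexpand emp_id
| eval emp=json_extract(day_json, emp_id)
| eval name=json_extract(emp, "name"), email=json_extract(emp, "email"), late=json_extract(emp, "late")
| stats values(eval(if(late=="true", day, null()))) as late_days, count as days_seen by emp_id, name, email
```

The idea is to enumerate the date keys with json_keys instead of hard-coding them in a spath path, then expand each day and each employee ID into its own row so normal stats can aggregate over the last 7 days.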
Hi @bambarita, using your link I can access the login page. Maybe there's some limitation in your proxy (if you have one). Ciao. Giuseppe
Hi @fredclown, It will be there by default, no need to define it again!
Basically, still not a question. If it is a question, what sort of answer are you expecting?
Basically, this is a question: I am able to see events until 4:00 AM, and after that I am not able to see them. With the below query I am able to check the last events:

| tstats count where index=cat by host, index, source, sourcetype, _time
| search host=*
| sort _time

@ITWhisperer
Yes, the link is still invalid. Contact https://splunk.my.site.com/customer/
>>> we are facing disk storage warning on license master.
Is it a clustered environment? Is it a single-indexer, single-SH environment? How critical is the data? Does the license master co-host any other Splunk instance (or is there any other SH/indexer created along with the License Master)?
>>> Can you suggest if we can remove some DB files.
Are these directories large or small? (If they are small, deleting them may not save much disk!)
I am getting the error: (502) Insufficient Privileges: You do not have View privilege on Course. I am enrolled in the Splunk Power User training and I cannot access my learning path because of this error.