All Posts

All your multi-select search is doing (assuming it is based on your base search) is giving you the names of the fields you have described in your base search (Pat, Con and Cov), so why not just hard-code them in your multi-select? If you want to continue using the base search, your multi-select search could be simplified to:

| fields - Category
| untable _time Reason CurationValue
| table Reason
| dedup Reason

Having said that, it is still not clear what is not working for you. Do you need something like this?

| search Reason IN ($t_reason$)
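For the hard-coding option, a minimal sketch of the multiselect input, assuming a Simple XML dashboard and that the token is t_reason (adjust names to your dashboard):

```xml
<input type="multiselect" token="t_reason">
  <label>Reason</label>
  <choice value="Pat">Pat</choice>
  <choice value="Con">Con</choice>
  <choice value="Cov">Cov</choice>
  <!-- wrap each selected value in quotes and comma-separate, so the
       token expands cleanly inside IN(...) -->
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter>, </delimiter>
</input>
```

With this, `| search Reason IN ($t_reason$)` receives a ready-made quoted list and no populating search is needed at all.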
@MikeyD100 - In your query you have a question mark (?), which DB Connect interprets as a parameter placeholder. I don't know if that is intended. If not, please try updating the query accordingly. I hope that helps!
Hi everyone! I'm here to share the resolution for one of the frequent errors we see in the internal logs (sourcetype=splunkd). If you happen to encounter the error below:

Failed processing http input, token name=token_name, parsing_err="Incorrect index", index=index_name

please make sure that your index name has been added to the respective HEC token. To avoid this error, add the index under the respective token's stanza as soon as a new index is created:

[http://token_name]
disabled = 0
index = default_index
indexes = index1, index2, index3, [add your index here]

Cheers
Hi, I have the below data:

_time                          SQL_ID  NEWCPUTIME
2023-10-25T12:02:10.140+01:00  ABCD    155.42
2023-10-25T11:57:10.140+01:00  ABCD    146.76
2023-10-25T11:47:10.156+01:00  ABCD    129.34
2023-10-25T11:42:10.163+01:00  ABCD    118.84
2023-10-25T12:07:10.070+01:00  ABCD    163.27
2023-10-25T11:52:10.150+01:00  ABCD    139.34

Expected output:

_time                          SQL_ID  NEWCPUTIME  delta
2023-10-25T12:07:10.070+01:00  ABCD    163.27      7.85
2023-10-25T12:02:10.140+01:00  ABCD    155.42      8.66
2023-10-25T11:57:10.140+01:00  ABCD    146.76      7.42
2023-10-25T11:52:10.150+01:00  ABCD    139.34      10
2023-10-25T11:47:10.156+01:00  ABCD    129.34      10.5
2023-10-25T11:42:10.163+01:00  ABCD    118.84      118.84

Splunk output, which is not correct:

_time                          SQL_ID  NEWCPUTIME  delta
2023-10-25T12:07:10.070+01:00  ABCD    163.27
2023-10-25T12:02:10.140+01:00  ABCD    155.42      7.85
2023-10-25T11:57:10.140+01:00  ABCD    146.76      8.66
2023-10-25T11:52:10.150+01:00  ABCD    139.34      7.42
2023-10-25T11:47:10.156+01:00  ABCD    129.34      10
2023-10-25T11:42:10.163+01:00  ABCD    118.84      10.5

I'm using the below query:

index=data sourcetype=dataset source="/usr2/data/data_STATISTICS.txt" SQL_ID=ABCD
| streamstats current=f window=1 global=f last(NEWCPUTIME) as last_field by SQL_ID
| eval NEW_CPU_VALUE = abs(last_field - NEWCPUTIME)
| table _time, SQL_ID, last_field, NEWCPUTIME, NEW_CPU_VALUE

I tried using the delta command as well, but I'm not getting the expected output either.
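A hedged sketch of one way to get the expected layout (assuming events come back newest-first, as in the output above, and that the oldest event's delta should be its own value): sort ascending so streamstats picks up the chronologically previous value, default the missing first value to 0, then restore descending order. Untested against the actual data:

```
index=data sourcetype=dataset source="/usr2/data/data_STATISTICS.txt" SQL_ID=ABCD
| sort 0 + _time
| streamstats current=f window=1 last(NEWCPUTIME) as prev_cpu by SQL_ID
| eval delta = round(NEWCPUTIME - coalesce(prev_cpu, 0), 2)
| sort 0 - _time
| table _time, SQL_ID, NEWCPUTIME, delta
```

The root cause of the "shifted" delta is that streamstats processes events in the order the search returns them (newest first here), so the "previous" event is actually the newer one; reversing the sort before streamstats fixes the pairing.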
I have my base search, and Pat, Con and Cov are individual columns. I want those to be the values for my multi-select. So in my multi-select I untable those columns into rows, with the column being Reason.

| table _time, Pat, Con, Cov, Category
How we can measures number of spool in SAP systems using AppDynamics
Did you make any progress on this one?
Thanks, but I also think it should be possible to mark everything between login and logout in a timechart. Maybe it's not possible. If not, I will investigate the apps.
You can't fill anything in because you don't differentiate between the login and logout events. The first bar is not necessarily a login followed by a logout, as your first event may be a logout, then a login, then another logout. You would need to make your search determine that 4624 is the start (login) event and 4634 the logout (end) event, rather than just doing a dc(user), which will always be 1. I suggest you look at a couple of apps instead of timechart that are designed for this: https://splunkbase.splunk.com/app/4370 https://splunkbase.splunk.com/app/3120
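A minimal sketch of the start/end pairing described above, assuming Windows Security events with EventCode and user fields (field names and index are assumptions; adjust to your data):

```
index=wineventlog EventCode IN (4624, 4634)
| transaction user startswith="EventCode=4624" endswith="EventCode=4634"
| table _time, user, duration
```

Each resulting event spans one login-to-logout session, with duration in seconds, which gives you explicit session intervals instead of a per-bucket dc(user).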
Hi, I'd like to know how to associate the "url" tag with the web data model. We're currently working with URL logs in our Splunk ES, but we're encountering difficulties in viewing the data model when conducting searches. Could someone kindly provide guidance on this matter? Thanks  
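The CIM Web data model picks up events tagged "web", so one common approach is an event type covering the URL logs plus a tag on that event type. A sketch with hypothetical index and sourcetype names:

```
# eventtypes.conf
[my_url_logs]
search = index=proxy sourcetype=my:url:logs

# tags.conf
[eventtype=my_url_logs]
web = enabled
```

After this (and CIM-compliant field aliases such as url, status, and action), events should appear under | datamodel Web search, and accelerated searches once the model rebuilds.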
What's the definition of your multiselect input? You've only listed the search. You are using Reason $t_reason$ in your search, but in your chart search, if it's coming from the base search, there is no Reason field, so you cannot filter by Reason. Is the t_category token coming from another input? If you are using the syntax Reason $t_reason$ and your input is a multiselect, then it looks odd that you have "Reason" in the search - is that just searching the raw text for Reason, or is that somehow part of a field called Reason?
Hi @VatsalJagani, Sorry for the delay in coming back to you on this. I have read this documentation and got the procedure to only return one OUT REFCURSOR. I don't need to pass in anything, so there is no need to pass params.

create or replace PROCEDURE GET_SOME_LOG_MONITORING (po_error_log_details_01 out SYS_REFCURSOR) AS.........

I can now successfully call the procedure using the below; however, I get no data returned from the SYS_REFCURSOR:

| dbxquery connection="SOME-CONNECTION" procedure="{call ISOMESCHEMA.GET_SOME_LOG_MONITORING(?)}"

However, when I get a DBA to log into the database directly on the server with the same user that Splunk is using to execute the PROCEDURE, they get the following results returned (columns: TIMESTAMP, CORRELATION, TO_CHAR(MSGXML)):

25-OCT-23 11.33.40.968589 AM  a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.44.569205 AM  a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.47.144192 AM  a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.49.823233 AM  a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.51.383443 AM  a2306D43d6606aa67f8jgrg21  {"errors":[{"code":"500.12.004","description":"ProviderServiceFailure"}]}
25-OCT-23 11.33.52.708949 AM

Splunk has all the permissions needed to successfully run the PROCEDURE, as proved by it being run directly on the server with data returned, but when I execute this from Splunk nothing is returned.
@danspav For some reason the comma is not removed from the token, so it becomes form.multi=val1,&form.multi=val2, and therefore the receiving dashboard gets the first multi value as "val1," - not quite sure why that regex keeps the comma.
Hi @satish, I also tried giving "charting.fieldColors" as below. Still there is no change in the pie chart. Please guide.

{
    "type": "splunk.pie",
    "title": " Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    },
    "options": {
        "charting.fieldColors": {
            "failed": "#FF0000",
            "passed": "#008000"
        }
    },
    "context": {},
    "showProgressBar": false,
    "showLastUpdated": false
}
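One thing worth noting: charting.fieldColors is a Simple XML option; in Dashboard Studio the splunk.pie visualization uses seriesColorsByField (to the best of my knowledge). A hedged sketch reusing the IDs from the snippet above:

```
"viz_HoWEnZsV": {
    "type": "splunk.pie",
    "title": "Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    },
    "options": {
        "seriesColorsByField": {
            "failed": "#FF0000",
            "passed": "#008000"
        }
    }
}
```

The keys under seriesColorsByField must match the slice labels exactly (case-sensitive), so check whether the search emits "passed"/"failed" or capitalized variants.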
Hi, I aimed to merge the "dropped" and "blocked" values under the "IDS_Attacks.action" field in the output of the data model search and include their respective counts within the newly created "blocked" value, so that I can add it to the dashboard. Output:

IDS_Attacks.action  count
allowed             130016
blocked             595
dropped             1123
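One hedged way to fold "dropped" into "blocked", assuming a tstats search over the CIM Intrusion_Detection data model (adjust the first line to the actual search feeding the dashboard):

```
| tstats count from datamodel=Intrusion_Detection where nodename=IDS_Attacks by IDS_Attacks.action
| eval action = if('IDS_Attacks.action' IN ("blocked", "dropped"), "blocked", 'IDS_Attacks.action')
| stats sum(count) as count by action
```

With the counts above, this would yield allowed=130016 and blocked=1718 (595 + 1123).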
Hi @satish, Even after providing the series colors as mentioned by you in the previous comment, I could not set/change the color in the pie chart. Could you please help on how to change the color of the pie chart? I want to provide two colors: red for the failed slice and green for the passed slice.

"viz_HoWEnZsV": {
    "type": "splunk.pie",
    "title": "Result",
    "dataSources": {
        "primary": "ds_PlnRHbXf"
    }
},
"ds_PlnRHbXf": {
    "type": "ds.search",
    "options": {
        "query": "index = _internal | chart avg(bytes) over source"
    },
    "seriesColors": ["#61A84F", "#FFBF00", "#FF0000"],
    "name": "Search_1"

Thanks,
Hi, Not sure how to fix the continuous bar between login and logout. As you can see in the picture, it's marked as login, lots of spaces, and then logout. The best would be everything colour-marked from login until logout. Thought it could be done through format, but not this time. Hope someone can help me with it. Rgds
I have a response from one of the client applications like this:

{ "employees": { "2023-03-16": { "1": { "id": 1, "name": "Michael Scott", "email": "demo@desktime.com", "groupId": 1, "group": "Accounting", "profileUrl": "url.com", "isOnline": false, "arrived": false, "left": false, "late": false, "onlineTime": 0, "offlineTime": 0, "desktimeTime": 0, "atWorkTime": 0, "afterWorkTime": 0, "beforeWorkTime": 0, "productiveTime": 0, "productivity": 0, "efficiency": 0, "work_starts": "23:59:59", "work_ends": "00:00:00", "notes": { "Skype": "Find.me", "Slack": "MichielS" }, "activeProject": [] }, "2": { "id": 2, "name": "Andy Bernard", "email": "demo3@desktime.com", "groupId": 106345, "group": "Marketing", "profileUrl": "url.com", "isOnline": true, "arrived": "2023-03-16 09:17:00", "left": "2023-03-16 10:58:00", "late": true, "onlineTime": 6027, "offlineTime": 0, "desktimeTime": 6027, "atWorkTime": 6060, "afterWorkTime": 0, "beforeWorkTime": 0, "productiveTime": 4213, "productivity": 69.9, "efficiency": 14.75, "work_starts": "09:00:00", "work_ends": "18:00:00", "notes": { "Background": "Law and accounting" }, "activeProject": { "project_id": 67973, "project_title": "Blue Book", "task_id": 42282, "task_title": "Blue Book task", "duration": 6027 } }..... } "__request_time": "1678957028" }

I am facing a problem with the date field "2023-03-16", as this field changes every day. I want to create statistics based on all employee IDs, late employees, email, etc. for the last 7 days. I have used spath but cannot use a wildcard search for all late employees across all days. Thanks
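An untested sketch of one way to work around the date key, assuming each API response is one event: let spath auto-extract everything (which produces field names like employees.2023-03-16.1.late), then use foreach with wildcards so the date segment does not have to be hard-coded:

```
| spath
| eval late_count = 0
| foreach employees.*.*.late
    [ eval late_count = late_count + if('<<FIELD>>' = "true", 1, 0) ]
| table late_count
```

The field names and wildcard behaviour here are assumptions; if the wildcard does not match across the date segment in your version, an alternative is to rewrite _raw with a fixed key (e.g. via an ingest-time SEDCMD) before spath runs.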
Hi @bambarita, using your link I can access the login page. Maybe there's some limitation in your proxy (if you have one). Ciao. Giuseppe
Hi @fredclown, It will be there by default, no need to define it again!