All Posts



How can I get the SVC usage for saved searches and ad-hoc searches? These logs don't have it:

index="_internal" data.searchType="scheduled"
index="_audit" sourcetype="audittrail" action="search" info="completed"
Thank you! This is really helpful.
I had the same symptoms. In my case, the root cause of the problem was that I was attempting to use the "Add Data" upload functionality on the search head (i.e. at https://myhost.splunkcloud.com/) instead of on the IDM (i.e. at https://idm.myhost.splunkcloud.com/). Uploading there worked without problems.

I do find it frustrating that the IDM and the search head seem to use the same interface, and you just have to expect various chunks of functionality to be broken on one and working on the other. But as far as things go, it's not too bad to follow the principle "if something fails on one, try it on the other".
I see what you mean about the missing | addinfo, my mistake. I tried three different ways: with the time picker token and without. I also tried with no token and with | addinfo, but I'm still getting an error.
Hi! I'm stuck writing a query with an additional check and can't find a way out. I would be glad if you could point me in the right direction or help with advice. We have the following custom logic:

1. When a user performs some action (which action is not important), we generate an event in index=custom with the following fields: evt_id: 1, user_id: 555 (example).
2. The user should confirm in a third-party app that they are performing this action, and that app generates the next event in index=custom: evt_id: 2, user_id: 555 (example), msg: confirmed.
3. If the user did NOT confirm the action from step 1, we need to generate an alert. That means Splunk did not receive evt_id: 2 in index=custom.

The alert logic is as follows: we need to alert when evt_id: 1 occurred more than 5 minutes ago (the time the user has to confirm the action) and there is NO evt_id: 2 with the same user_id by the time the alert runs.

I understood that I need to do a first search like (example): index=custom evt_id=1 earliest=-7m latest=-5m. But I have no idea how to implement the additional condition on evt_id: 2. If we didn't have the user_id field, I could use the stats count command, but I need to correlate both events (1 and 2) by the user_id field.

Thanks for your help, have a nice day.
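One possible approach (a sketch, not a tested answer, assuming both event types land in index=custom and share the user_id field) is to search both event types over the window and keep only users whose action is older than 5 minutes and who have no confirmation:

```spl
index=custom (evt_id=1 OR evt_id=2) earliest=-7m
| stats min(eval(if(evt_id=1,_time,null()))) as action_time,
        count(eval(evt_id=2)) as confirmations by user_id
| where isnotnull(action_time)
        AND action_time <= relative_time(now(),"-5m")
        AND confirmations=0
```

Each row surviving the where clause is a user_id to alert on; adjust the earliest window if confirmations can arrive later than expected.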
Thank you. I knew there was probably some way to iterate, but couldn't figure it out. Thank you.
Thank you. That did the trick. Adding a  | stats values(open_ports) by destination  allows me to group and add them all in one row.  Thank you again for the prompt help  
Splunk PS installed UBA a while back, and I just noticed that we are not getting OS logs from those servers into Splunk Enterprise.  Since we have a 10 node cluster, I was trying to find a quicker way to manage them.  Is there a reason I shouldn't connect the Splunk Enterprise running on all of those nodes to the deployment server?
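For reference, pointing those nodes at a deployment server only needs a small client-side config. A minimal sketch (assuming the DS listens on the default management port 8089; ds.example.com is a placeholder for your deployment server's hostname):

```ini
# $SPLUNK_HOME/etc/system/local/deploymentclient.conf
[deployment-client]

[target-broker:deploymentServer]
targetUri = ds.example.com:8089
```

Whether the UBA nodes' Splunk Enterprise instances should be DS-managed is a separate policy question; the config itself is standard.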
Hello, community, I wanted to share a challenge I'm having mapping fields to Data Models. The issue is that I have identified/created fields that are required for a Data Set, but they are not auto-populating, i.e. they cannot be seen by the Data Model/Set. Any suggestions as to where I might be going wrong? Regards, Dan
I tried a couple of times and kept seeing the same error. After that I tried the workaround above:

cd $SPLUNK_HOME/etc/apps/ForensicInvestigator/default/data/ui/views
iconv -f UTF-16LE -t UTF-8 portsservices.xml -o portsservices1.xml

I removed the old file and renamed portsservices1.xml to portsservices.xml, and it worked for me. After I restarted Splunk, everything is working as expected.
I strongly advise against modifying datamodels that are not your own. If you change a DM, your changes will override any future versions of the DM that may be released. Instead, have your dashboard combine the values by changing "dropped" to "blocked" (note that eval requires single quotes around field names containing dots):

| eval 'IDS_Attacks.action'=if('IDS_Attacks.action'="dropped","blocked",'IDS_Attacks.action')
As per the documentation, you need to have the (?) when using Oracle. As I understand it, that is for the OUT REFCURSOR; if I take it out I get an error. Plus, when there is no data to return, I get this from Splunk, so it works with (?), which matches what I see when the SQL is run directly on the server. The problem seems to be when there is data to be returned, but I am not sure what the issue is.
The output may not be what is desired, but it is correct. The streamstats and delta commands compute the difference between the current result and the previous result rather than between the current result and the next result (which is unseen at the time). One workaround is to surround the streamstats or delta command with reverse, which changes the order of events and then changes it back:

index=data sourcetype=dataset source="/usr2/data/data_STATISTICS.txt" SQL_ID=ABCD
| reverse
| streamstats current=f window=1 global=f last(NEWCPUTIME) as last_field by SQL_ID
| reverse
| eval NEW_CPU_VALUE=abs(last_field - NEWCPUTIME)
| table _time, SQL_ID, last_field, NEWCPUTIME, NEW_CPU_VALUE
Is it possible to display textual (string) values instead of numbers on the Y axis? I have a time series with a field called "state", which contains an integer number. Each number represents a cer... See more...
Is it possible to display textual (string) values instead of numbers on the Y axis? I have a time series with a field called "state", which contains an integer number. Each number represents a certain state. Examples: 0="off", 1="on" 0="off", 1="degraded", 2="standby", 3="normal", 4="boost" Now I would like to have a line or bar chart showing the respective words on the Y axis ticks instead of 0, 1, 2, 3, 4. Note: This was already asked but not answered satisfactorily: https://community.splunk.com/t5/Splunk-Search/Is-it-possible-to-make-y-axis-labels-display-quot-on-quot-and/m-p/222217 
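Not a full answer to the axis-label question, but the number-to-word mapping itself can be done in the search with eval's case() function. A sketch assuming the five-state example above:

```spl
... | eval state_label=case(state=0,"off",
                            state=1,"degraded",
                            state=2,"standby",
                            state=3,"normal",
                            state=4,"boost")
```

The state_label field can then at least be shown in legends, tooltips, and tables, even where the chart's numeric y-axis ticks cannot be replaced directly.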
@richgalloway Hi, I tried the below and we are getting the following error: Error in where command: The operator at '::trapdown AND _time<=relative_time(now(),"-5m") is invalid. Please help me. Thanks!
All your multi-select search is doing (assuming it is based on your base search) is giving you the names of the fields you have described in your base search (Pat, Con and Cov), so why not just hard code them in your multi-select? If you want to continue using the base search, your multi-select search could be simplified to

| fields - Category
| untable _time Reason CurationValue
| table Reason
| dedup Reason

Having said that, it is still not clear what is not working for you. Do you need something like this?

| search Reason IN ($t_reason$)
@MikeyD100 - In your query you have a (?) question mark, which DB Connect interprets as a parameter placeholder. I don't know if that is intended; if not, please try updating the query accordingly. I hope that helps!
Hi everyone! I'm here to share the resolution for one of the frequent errors we see in the internal logs with sourcetype=splunkd. If you happen to encounter the error below:

Failed processing http input, token name=token_name, parsing_err="Incorrect index", index=index_name

please make sure that your index name has been added to the respective HEC token. To avoid this error, add the index under the respective token as soon as a new index is created:

[http://token_name]
disabled = 0
index = default_index_name
indexes = index1, index2, index3 [add your index here]

Cheers
Hi, I have the below data:

_time                          SQL_ID  NEWCPUTIME
2023-10-25T12:02:10.140+01:00  ABCD    155.42
2023-10-25T11:57:10.140+01:00  ABCD    146.76
2023-10-25T11:47:10.156+01:00  ABCD    129.34
2023-10-25T11:42:10.163+01:00  ABCD    118.84
2023-10-25T12:07:10.070+01:00  ABCD    163.27
2023-10-25T11:52:10.150+01:00  ABCD    139.34

The EXPECTED output is:

_time                          SQL_ID  NEWCPUTIME  delta
2023-10-25T12:07:10.070+01:00  ABCD    163.27      7.85
2023-10-25T12:02:10.140+01:00  ABCD    155.42      8.66
2023-10-25T11:57:10.140+01:00  ABCD    146.76      7.42
2023-10-25T11:52:10.150+01:00  ABCD    139.34      10
2023-10-25T11:47:10.156+01:00  ABCD    129.34      10.5
2023-10-25T11:42:10.163+01:00  ABCD    118.84      118.84

The Splunk output, which is not correct:

_time                          SQL_ID  NEWCPUTIME  delta
2023-10-25T12:07:10.070+01:00  ABCD    163.27
2023-10-25T12:02:10.140+01:00  ABCD    155.42      7.85
2023-10-25T11:57:10.140+01:00  ABCD    146.76      8.66
2023-10-25T11:52:10.150+01:00  ABCD    139.34      7.42
2023-10-25T11:47:10.156+01:00  ABCD    129.34      10
2023-10-25T11:42:10.163+01:00  ABCD    118.84      10.5

I'm using the below query:

index=data sourcetype=dataset source="/usr2/data/data_STATISTICS.txt" SQL_ID=ABCD
| streamstats current=f window=1 global=f last(NEWCPUTIME) as last_field by SQL_ID
| eval NEW_CPU_VALUE=abs(last_field - NEWCPUTIME)
| table _time, SQL_ID, last_field, NEWCPUTIME, NEW_CPU_VALUE

I tried using the delta command as well, but I'm not getting the expected output with it either.
I have my base search, and Pat, Con and Cov are individual columns. I want those to be the values for my multi-value select. So in my multi-value select I un-table those columns into rows, with the column being Reason.

| table _time, Pat, Con, Cov, Category