I recently installed a brand-new Splunk 8.2.2 and then installed Splunk ES 6.6.0 on it. After ES was installed and configured, I restarted Splunk from the CLI and got the below error message: "Checking conf files for problems... Invalid key in stanza [notable] in /opt/splunk/etc/apps/SA-ThreatIntelligence/default/alert_actions.conf, line 84: param.default_disposition (value: )." There is no such error on Splunk ES 6.4.1, and that key does not exist there; it is new in ES 6.6.0. Does anyone know how to fix it? Many thanks!
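One possible workaround (a sketch, not a confirmed fix) is to give the key an explicit value in a local override, assuming the warning is triggered by the empty value shipped in the default file; the disposition value below is an assumption:

```
# $SPLUNK_HOME/etc/apps/SA-ThreatIntelligence/local/alert_actions.conf
# Local override; the key name comes from the warning, the value is assumed.
[notable]
param.default_disposition = disposition:0
```

After restarting, re-running `splunk btool check` should show whether the warning clears.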
Hi all, I am trying to set up a dashboard to view a list of sudo commands by server. I started with the IT Essentials Learn app, which recommends this search:

index=* sourcetype=linux_secure process=sudo COMMAND=* host=* | rex "COMMAND=(?<raw_command>.*)" | eval COMMAND=coalesce(raw_command, COMMAND) | table _time host USER PWD COMMAND

This search did not work for me, so I started playing with it a bit and realized that the sourcetype linux_secure does not exist in my environment. My understanding is that the Splunk Add-on for Unix and Linux is supposed to apply this sourcetype. I verified my configuration and didn't see anything to modify, so I looked at the $SPLUNK_HOME/etc/apps/Splunk_TA_nix/default/inputs.conf file. I cannot find a single instance of sourcetype=linux_secure in that file, so I don't think that sourcetype is being applied to any sources. Has linux_secure been deprecated, or do I simply need to add something to my local/inputs.conf? Does anyone have a recommended way to perform this search? I have tried a number of methods but am struggling to get what I need.
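If the events live in /var/log/secure, one option (a sketch; the log path and app location are assumptions about the environment) is to assign the sourcetype explicitly in a local override:

```
# $SPLUNK_HOME/etc/apps/Splunk_TA_nix/local/inputs.conf
[monitor:///var/log/secure]
sourcetype = linux_secure
disabled = 0
```

On Debian/Ubuntu systems the equivalent file is /var/log/auth.log, so the monitor path would change accordingly.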
I'm working on enhancing our data pipeline by introducing a messaging bus such as Kafka or Pulsar. Both are enticing options, but they each come with their own advantages and drawbacks. This installation will be dedicated to Splunk, so no shared messaging buses will interfere with our logging needs. I would love to know the user community's experiences with either of these platforms. Why did you choose one over the other? Have you regretted the choice, and why? Thanks. The Frunkster
I am looking for a way to filter the events that a user can see based on the values in the event. For example, if there are events with the field 'building' and the field has values 'a' through 'z', I would want user 1 to only be able to retrieve events where the building is of a value 'a' through 'g', and user 2 could be given access to events where the building values are 'f' through 'p'. I have looked into using roles to apply filters, but those are limited to indexed fields, and I will have dozens of fields in my events that need this type of filtering, so it is not a good option. Additionally, the filtering should be secure so that there is no way for users to bypass it. Any ideas?
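For reference, the role-based approach looks like the sketch below (the role name and building values are assumptions). As noted above, srchFilter only matches reliably on indexed fields, so this does not by itself solve the many-fields case:

```
# authorize.conf (sketch)
[role_building_a_g]
importRoles = user
srchFilter = building IN (a,b,c,d,e,f,g)
```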
I created an Access Policy in Azure. How do I configure the Storage Account to use the Access Policy (per https://docs.splunk.com/Documentation/AddOns/released/MSCloudServices/Configurestorageaccount)? In Azure there is an Access Policy Identifier assigned to the Access Policy; where does this get entered in the Storage Account form?
I've created a custom Python input via Add-on Builder, called events from the API for the last 24 hours, and the data is there; I can see it in the response during data collection testing. However, the add-on I am trying to create needs to run on a 60-second schedule. At Edit Data Input I selected a collection interval of 60 seconds, and in my add-on I look at the past 60 seconds. At the Define & Test step I click Finish and get confirmation about the configured interval, and that's all. There are no events, and based on the _internal logs it didn't even run once after saving. Any idea what could be wrong? Why isn't the script running on schedule?
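To check whether splunkd is launching the modular input at all, a search along these lines can help (the quoted input name is an assumption; substitute the add-on's actual input name):

```
index=_internal sourcetype=splunkd component=ExecProcessor "my_custom_input"
| table _time log_level event_message
```

If nothing shows up there, the input stanza may never have been saved with `disabled = 0`, which is worth verifying in the app's local/inputs.conf.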
We are using coldToFrozenScript to store frozen index data in GCS. To prove our DR annually we need to restore. This is the first time I have done so at this company, and I ran into an error that pukes out when I run the rebuild command; however, the data appears to show up in Splunk and is searchable. So I'm wondering: is this error something that can be dismissed, or is it something I should pay attention to?

WARN IndexConfig - Home path size limit cannot accommodate maximum number of hot buckets with specified bucket size because homePath.maxDataSizeMB is too small. Please check your index configuration: idx=linux maxDataSize=750 MB, homePath.maxDataSizeMB=800 MB

The indexes.conf for this index is as follows:

[linux]
repFactor = auto
homePath = volume:indexvol001/$_index_name/db
coldPath = volume:cold/$_index_name/colddb
thawedPath = $SPLUNK_DB/linux/thaweddb
tstatsHomePath = volume:_splunk_summaries/$_index_name/datamodel_summary/
frozenTimePeriodInSecs = 31536000
homePath.maxDataSizeMB = 800
maxTotalDataSizeMB = 491789400
maxWarmDBCount = 285
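The warning is about headroom: the home path must be able to hold the maximum number of concurrent hot buckets at full bucket size, and with maxDataSize=750 MB an 800 MB cap leaves room for barely one. A sketch of one possible adjustment (the value below is an assumption, not a sizing recommendation; it should be derived from maxDataSize times the effective maxHotBuckets, which is often 3):

```
# indexes.conf (sketch)
[linux]
homePath.maxDataSizeMB = 4000
```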
Is it possible to record user sessions? I mean a video recording, and to correlate it with: Windows user behavior; Linux user behavior; DB (both SQL and Oracle) behavior, i.e. query execution, procedures, and index/table/user creation and dropping.
Hello, the linked answer is helpful, except when the value of the column is multivalue. How do you remove the blue hyperlink color and background from an image when you click on it?

<html> <style> #tableWithDrilldown2 table tbody tr td,#tableWithDrilldown2 table thead th a{color: white !important;} </style> </html> <table id="tableWithDrilldown2">

This is the value of one of the columns in our dashboard table:

| eval events=case( pt="1-acc",TIME." ".dur." ".pt." Code=".cde." ".certStatus." for ".user, pt="2-pay",TIME." ".dur." ".pt." ".user." Code=".cde." ".week." $".gross." Bal: $".bal, pt="3-new",TIME." ".dur." ".pt." ".user." Code=".cde)

This is an example of the output for just two users. The combination of events can appear in different orders and include 1, 2, or all 3 of the event types (acc, pay, new). NOTE: the time overlap between the two users' sets of events is a coincidence, not common.

19:35:09.3 10.7 1-acc Code=0000 Success for Jessie
19:35:19.2 09.8 3-new Jessie Code=2801
19:36:56.3 01:37.1 2-pay Jessie Code=0000 03-27-2021 $0 Bal: $17250
19:45:09.3 10.7 1-acc Code=0000 Success for Billie
19:45:19.2 09.8 3-new Billie Code=2801
19:46:56.3 01:37.1 2-pay Billie Code=0000 03-27-2021 $0 Bal: $17250

Thanks and God bless, Genesius
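For multivalue cells, Splunk's Simple XML table renderer wraps each value in its own element, which in recent versions carries the class multivalue-subcell (worth confirming in the browser inspector, since the class name can vary by version). A sketch extending the existing selector to cover those sub-elements:

```
<html>
  <style>
    #tableWithDrilldown2 table tbody tr td,
    #tableWithDrilldown2 table tbody tr td .multivalue-subcell,
    #tableWithDrilldown2 table thead th a { color: white !important; }
  </style>
</html>
```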
Hi, I need to calculate the duration between each Out/In pair where A=A+100, B=B, and IDS=IDS.

00:03:02.067 app catZZ_DDP_AP: O[host]A[1000]B[123456]IDS[123456789987]
00:03:02.110 app catZZ_DDP_AP: I[host]A[1100]B[123456]IDS[123456789987]

Expected output:

duration          B          IDS
00:00:00.043      123456     123456789987

Any idea? Thanks
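One way to sketch this (the extracted field names are assumptions): pull out the direction and the three identifiers, normalize A so the O and I events share a key, then take the time range per pair:

```
| rex "(?<dir>[OI])\[host\]A\[(?<A>\d+)\]B\[(?<B>\d+)\]IDS\[(?<IDS>\d+)\]"
| eval A_norm=if(dir="O", A+100, tonumber(A))
| stats range(_time) AS duration values(B) AS B by A_norm IDS
| eval duration=tostring(duration, "duration")
```

This assumes _time carries millisecond precision, which depends on the timestamp extraction for this sourcetype.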
Hi, I want to create a correlation alert that will trigger and collect all the events from the same IP within a certain time window. I tried to "group by", but it didn't work. THX
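A sketch of the grouping (the index, field name src_ip, and the threshold are all assumptions; the time window comes from the scheduled search's time range):

```
index=security_events
| stats earliest(_time) AS first_seen latest(_time) AS last_seen values(signature) AS signatures count by src_ip
| where count > 10
```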
Hi, is it possible to show decimal numbers on a Sankey diagram? E.g., my SPL command produces the number 0.13, but the Sankey diagram just shows 0. Any idea? Thanks
I am trying to filter out null values from the result of stats. The query looks like this:

index=someindex* some ((somefield1=value1 AND somefield2="value2") AND (somefield1=value3 OR (somefield2=value4 AND somefield1=value5))) OR (somefield1=value6)
| eval someeval=...
| replace "some*" with "SOME" in somefield1
| bucket _time span=1d as daytime
| stats max(eval(if(somefield1=value1,_time,null()))) as val1_time min(eval(if(somefield1=value2,_time,null()))) as val2_time min(eval(if(somefield1=value3,_time,null()))) as val3_time by somefield3 somefield4
| eval recovered_time=if(isNotNull(val2_time),val2_time,val3_time)
| where isNotNull(val1_time)

But this query also returns results with a null or empty val1_time. What could be the issue in this query? I further pass the result of this query to another stats query, but I am stuck here.
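One thing worth checking (a sketch, not a confirmed diagnosis): the eval function is documented in lowercase as isnotnull, and stats can emit empty strings rather than true nulls for some groupings, so testing for both conditions may help:

```
| where isnotnull(val1_time) AND val1_time != ""
```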
Hi, I am trying to do index-time masking where my data is not in _raw but in a separate field, A. For example, field A has the following data: "Path=/LoginUser Query=CrmId=ClientABC& ContentItemId=TotalAccess&SessionId=3A1785URH117BEA&Ticket=646A1DA4STF896EE& SessionTime=25368&ReturnUrl=http://www.clientabc.com, Method=GET,IP=209.51.249.195, Content=", ""

I have applied the following transforms rule:

[session-anonymizer]
SOURCE_KEY = field:A
REGEX = (?m)^(.*)SessionId=\w+(\w{4}[&"].*)$
FORMAT = $1SessionId=########$2
DEST_KEY = field:A

The problem is that when we set DEST_KEY to _raw, the data is masked properly, but I need the masked data written back to field A. How do we get this masked into field:A? I have also tried adding

[accepted_keys]
is_valid = field:A
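An alternative sketch uses INGEST_EVAL (available in transforms.conf since Splunk 7.2), which can overwrite an indexed field in place; the regex below is a simplified version of the original masking (it replaces the whole session ID rather than preserving the last four characters), so treat it as a starting point:

```
# transforms.conf (sketch)
[session-anonymizer-eval]
INGEST_EVAL = A=replace(A, "SessionId=\w+", "SessionId=########")

# props.conf (sketch; sourcetype name is an assumption)
[your_sourcetype]
TRANSFORMS-mask-session = session-anonymizer-eval
```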
Hi, how can I hide the "code" column from the output of the lookup command?

.... | lookup myfile.csv code OUTPUT description

FYI: I have some stats before the lookup, so I don't want to use the "table" command. Any idea? Thanks,
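The fields command can drop a single column without re-listing the rest, so something like this should work after the lookup:

```
.... | lookup myfile.csv code OUTPUT description
| fields - code
```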
Is there a way to set permissions for MLTK model files in the local.meta file?
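MLTK models are stored as lookup artifacts named __mlspl_<model>.mlmodel, so one sketch (the model name below is an assumption) is a local.meta stanza like:

```
# metadata/local.meta (sketch)
[lookups/__mlspl_my_model.mlmodel]
access = read : [ * ], write : [ admin ]
export = system
```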
Hi! Thanks for your help. I have a question; all of this is in Dashboard Studio. I need to add a digital clock (hh:mm:ss) to the dashboard that looks nice and shows the time in real time. Also, the dashboard is updated every minute, and we need to show the time (hh:mm:ss) it was updated in another panel (we don't want to use the ShowLastUpdated code). Regards!
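One sketch that covers both panels: a single-value visualization driven by a tiny search, with the data source's refresh interval set in Dashboard Studio to every second for the clock, or every minute for the last-updated panel:

```
| makeresults
| eval clock=strftime(now(), "%H:%M:%S")
| table clock
```

Whether a one-second refresh is acceptable depends on the load it adds, since each refresh is a scheduled search dispatch.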
Hi, I have a radio button with 3 choice values. When any of the radio buttons is clicked or hovered, it should show a message. Can you please help me with the code? Example: when "TR Details" is hovered/clicked it should show the message 'TR', and similarly when "TR DUE" is hovered/clicked it should show the message 'DUE'. Below is my radio button code:

<input type="radio" id="landscape" token="TR">
  <label>Landscape</label>
  <choice value="TR Details">TR Details</choice>
  <choice value="TR DUE">TR DUE</choice>
  <change>
    <condition label="TR Details">
      <set token="TR view">TR view</set>
      <unset token="TR DUE">TR DUE</unset>
    </condition>
    <condition label="TR DUE">
      <set token="TR DUE">TR DUE</set>
      <unset token="TR view">TR view</unset>
    </condition>
  </change>
</input>
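Hover events are not available in Simple XML without custom JavaScript, but the click case can be sketched with a message token displayed in an html element (the token name msg is an assumption; tokens containing spaces, as in the original, are best avoided):

```
<input type="radio" id="landscape" token="TR">
  <label>Landscape</label>
  <choice value="TR Details">TR Details</choice>
  <choice value="TR DUE">TR DUE</choice>
  <change>
    <condition label="TR Details"><set token="msg">TR</set></condition>
    <condition label="TR DUE"><set token="msg">DUE</set></condition>
  </change>
</input>
<html depends="$msg$">
  <p>$msg$</p>
</html>
```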
I have about 10 indexers in a cluster. For some reason my master node turned off, and when it turned back on, my data had disappeared: there were 18 million events, and now there are 9 million. For what reason could this happen? I can't find anything in the logs. Help, please!
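One way to sketch a first check of where the buckets actually are (before assuming data loss rather than unsearchable copies while the cluster recovers):

```
| dbinspect index=*
| stats sum(eventCount) AS events count AS buckets by index, splunk_server
```

Comparing this per-indexer view against the cluster manager's bucket-fixup status should show whether copies are missing or merely not yet searchable.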
Need help with the below. The sourcetypes have different values in them, like below:

index=a sourcetype=b | eval details=1
| append [|search index=c sourcetype=d | eval details=2]
| append [|search index=e sourcetype=f | eval details=3]
| eventstats count by details
| Pass%=count(pass)/total*100,2 Fail%=count(fail)/total*100,2 Error%=count(Error)/total*100,2
| table pass fail error total

I have a bar chart with details on the x-axis and the percentages (Pass%, Fail%, Error%) on the y-axis. When I click a details value on the x-axis of the bar chart, a single value should show the number of individual pass, fail, and error events in a trellis. Please let me know how this can be achieved.
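One sketch for the drilldown part (the status field name is an assumption, and the percentage line above would still need to become valid eval syntax, e.g. round(pass/total*100,2) inside an eval): capture the clicked x-axis value in a token and drive a trellis single-value panel from it.

```
<chart>
  <search><query>... the percentage search above ...</query></search>
  <drilldown>
    <set token="sel_details">$click.value$</set>
  </drilldown>
</chart>
<single depends="$sel_details$">
  <search>
    <query>(index=a sourcetype=b) OR (index=c sourcetype=d) OR (index=e sourcetype=f) details=$sel_details$ | stats count by status</query>
  </search>
  <option name="trellis.enabled">true</option>
  <option name="trellis.splitBy">status</option>
</single>
```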