All Topics

Hi community, I need to exclude AIX logs containing a certain field value. This is the regex the parser is using to extract the vendor_action field:

^\w+\s+\w+\s+\d+\s+\d+\:\d+\:\d+\s+\d+\s+(?<pid>\d+)\s+(?<ppid>\d+)\s+(?<user>\S+)\s+(?<process>\S+)\s+(?<vendor_action>\S+)\s+(?<status>\S+)

I'm trying to exclude events that contain vendor_action=FILE_Unlink, and these are my conf files located on the Heavy Forwarder:

props.conf

[aix:audit]
TRANSFORMS-null= setnull

transforms.conf

[setnull]
REGEX = ^\w+\s+\w+\s+\d+\s+\d+\:\d+\:\d+\s+\d+\s+\d+\s+\d+\s+\S+\s+\S+\s+FILE_Unlink\s+\S+
DEST_KEY = queue
FORMAT = nullQueue

Here are sample logs; the first one should be excluded, while the second one should not:

Fri Jul 02 10:01:49 2021 34078844 8520050 dbloader rm FILE_Unlink OK Not supported filename /tmp/CSI_ODS_M_SIA__INFO_RILANCIO.txt
Fri Jul 02 10:01:46 2021 34930828 4587668 root root lsvg FILE_Unlink OK filename /dev/__pv17.0.34930828

When I restart Splunk, all logs are excluded, so I think something is wrong with my REGEX, even though it seems to work fine on regex101.

Any ideas? Thanks a lot, Marta
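The regex101 check the poster describes can be reproduced outside Splunk. A minimal Python sketch (not part of the original post), using the transforms.conf REGEX verbatim against the two sample events:

```python
import re

# The null-queue regex from the poster's transforms.conf, copied verbatim
null_queue = re.compile(
    r"^\w+\s+\w+\s+\d+\s+\d+\:\d+\:\d+\s+\d+\s+\d+\s+\d+\s+\S+\s+\S+\s+FILE_Unlink\s+\S+"
)

events = [
    # vendor_action=FILE_Unlink -> should be dropped
    "Fri Jul 02 10:01:49 2021 34078844 8520050 dbloader rm FILE_Unlink OK Not supported filename /tmp/CSI_ODS_M_SIA__INFO_RILANCIO.txt",
    # vendor_action=lsvg (FILE_Unlink appears in a later column) -> should be kept
    "Fri Jul 02 10:01:46 2021 34930828 4587668 root root lsvg FILE_Unlink OK filename /dev/__pv17.0.34930828",
]

for e in events:
    dropped = null_queue.search(e) is not None
    print("dropped" if dropped else "kept")
```

On these two raw strings the pattern matches only the first event, consistent with the poster's regex101 observation.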
Hi, I have a folder which has .csv, .list, .sps, and .param types of files, and I need to index them through inputs.conf. What should I provide as my sourcetype?
Hi, I'm going to deploy a distributed Splunk system where the licenses are going to be held by the license master. The problem is that there can be no internet connection in the system. Is there any way to manage licenses without internet access in this type of system?
Hi All, I need help with the below requirement. I am getting data from ServiceNow. I calculated the percentage by date, and the results are as shown below:

DATE        Severity    Percentage
7/02/2021   P1          100
7/02/2021   P2          100
7/02/2021   P3          100
8/02/2021   P1          100
8/02/2021   P2          100
8/02/2021   P3          100

For the above results, I am good. But whenever there are no values for P1, P2, or P3, I would like to display NA.

Kindly help me.

Thanks & Regards, Balaji
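The gap-filling the poster asks for amounts to emitting every (date, severity) pair and substituting "NA" where no row exists. A sketch of that idea in plain Python rather than SPL (the sample data and the missing P2 row are made up for illustration):

```python
from itertools import product

# Rows as produced by the existing search (hypothetical sample: P2 missing on 8/02)
rows = {
    ("7/02/2021", "P1"): 100,
    ("7/02/2021", "P2"): 100,
    ("7/02/2021", "P3"): 100,
    ("8/02/2021", "P1"): 100,
    ("8/02/2021", "P3"): 100,
}

dates = sorted({d for d, _ in rows})
severities = ["P1", "P2", "P3"]

# Emit every (date, severity) combination, filling gaps with "NA"
filled = [(d, s, rows.get((d, s), "NA")) for d, s in product(dates, severities)]
for d, s, pct in filled:
    print(d, s, pct)
```

The output always has one row per date per severity, with "NA" wherever the source data had no value.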
I have 3 sources of data, A, B, and C, and they have some common data. Source C is an inputlookup. There are two multiselect fields, "INCLUDE Source" and "EXCLUDE Source". Whichever source I select in "INCLUDE Source" should have its searched data appended to the table accordingly, and no source must be excluded unless specified in "EXCLUDE Source" (i.e., by default NONE should be selected in "EXCLUDE Source"). I want to use the multiselect feature in Splunk in the following way:
1.) By default, data from all the sources should be appended one after another, and duplicates should be removed (i.e., "INCLUDE Source" must have the value ALL and "EXCLUDE Source" must have NONE).
2.) Depending on the order of the sources included in "INCLUDE Source", the data should be appended to the table, and depending on the entries in "EXCLUDE Source", the data must be removed from the table. In all cases the duplicates must be removed.
I tried using 3 radio buttons with YES and NO as options, but I was not able to get the result.
I followed @niketn 's syntax to produce a panel of table cells. My syntax is as follows:

<dashboard>
  <label>Test2</label>
  <row>
    <panel>
      <html>
        <style>
          #highlight table tbody{
            display:flex;
            flex-wrap: wrap;
          }
          #highlight table tbody tr{
            margin-right:10px;
            margin-bottom:10px;
          }
          #highlight table tbody tr td{
            width: 180px;
            text-align: center;
          }
        </style>
        <div>
          <div>Clicked Value in Cell (click.value):$click.value$</div>
          <div>Clicked Cell Values (click.value2):$click.value2$</div>
        </div>
      </html>
      <table id="highlight">
        <search>
          <query>index="omap_heng" | fields host reception Station | sort 0 +host | search reception = * | streamstats count by reception,host reset_on_change=true | eval new_counter=if(reception == 1,0, count) | table host new_counter Station | streamstats max(new_counter) as max_counter by host | sort 0 +host, -max_counter | streamstats first(max_counter) as max_counter by host | where new_counter=max_counter | table Station,max_counter | eval "Max Counter Info"=Station."###".max_counter | fields "Max Counter Info" | makemv "Max Counter Info" delim="###"</query>
          <earliest>0</earliest>
          <latest></latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">cell</option>
        <option name="percentagesRow">false</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
        <format type="color" field="Max Counter Info">
          <colorPalette type="list">[#53A051,#006D9C,#F1813F,#DC4E41]</colorPalette>
          <scale type="threshold">0,6,11</scale>
        </format>
        <format type="number" field="Max Counter Info">
          <option name="precision">0</option>
        </format>
        <drilldown>
          <set token="click.value">$click.value$</set>
          <set token="click.value2">$click.value2$</set>
        </drilldown>
      </table>
    </panel>
  </row>
</dashboard>

I would like to highlight each cell based on a range of the max_counter value (the circled value in my image below).
Hence, if my max_counter is between 0 and 6, then green; if between 7 and 11, then orange; and if above 11, red. My panel currently looks like this now. Please help! Thanks!
Hi All, Splunk users are repeatedly shown "Your session expired. Log in to return to the system" while they are actively using Splunk. I suggested they clear their cache, but it is of no use. This is really frustrating for them from a user's point of view. I already have 60 min specified in server.conf and web.conf. Could you please help me get to a solution? Regards, Jugabanhi
Hey All, here is my search:

index=main event_simpleName=NeighborListIP4 OR event_simpleName=SensorHeartbeat
| rex field=NeighborList "(?<MAC1>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;"
| rex field=NeighborList "(?<MAC1>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC2>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;"
| rex field=NeighborList "(?<MAC1>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC2>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC3>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;"
| rex field=NeighborList "(?<MAC1>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC2>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC3>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC4>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC5>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;"
| rex field=NeighborList "(?<MAC1>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC2>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC3>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC4>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC5>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;.*?(?<MAC6>.................)\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|.*;"
| eval Combiner = mvappend('MAC1', 'MAC2', 'MAC3', 'MAC4', 'MAC5', 'MAC6')
| mvexpand Combiner
| dedup Combiner
| table Combiner

I want to show what is in the Combiner field but not present within the MAC field, only inside event_simpleName=SensorHeartbeat MAC=*. However, both event_simpleName=NeighborListIP4 and event_simpleName=SensorHeartbeat contain the field name MAC. I'm not sure what the most efficient way of doing this is; I was attempting to use the diff command, but had no luck. Any help would be much appreciated! Thanks
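Outside of SPL, the stacked extractions above all look for the same shape: a 17-character token followed by `|<IPv4>|0|...;`. A Python sketch of collapsing them into one repeated pattern with `findall` (the NeighborList value below is invented for illustration; the real field format is only inferred from the poster's regexes):

```python
import re

# One repeatable pattern instead of six cumulative ones:
# a 17-char MAC, then |IPv4|0|, then everything up to the next ';'
mac_re = re.compile(r"(\S{17})\|\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\|0\|[^;]*;")

# Hypothetical NeighborList payload
neighbor_list = (
    "00-11-22-33-44-55|10.0.0.1|0|x;"
    "66-77-88-99-aa-bb|10.0.0.2|0|y;"
    "cc-dd-ee-ff-00-11|10.0.0.3|0|z;"
)

macs = mac_re.findall(neighbor_list)
print(macs)
```

Using `[^;]*;` instead of the greedy `.*;` keeps each match bounded at the next semicolon, which is what lets one pattern pull out every neighbor entry regardless of how many there are.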
Hello, I want to protect our server performance and data quality. I found some customers trying to onboard their data by themselves, which causes a lot of operational overhead. How can I prevent this? Will a future version of Splunk have such a feature?
Scenario 1: Customers built their own apps, installed on their own forwarders, with or without a new sourcetype name and without parsing config, and these apps were not pushed from our deployment server. How can we detect and block this data instead of blocking the whole forwarder?
Scenario 2: For both HEC and tcp/udp inputs, how can we ensure customers do not overwrite the index name and sourcetype name, which should be the ones configured on our side? If we detect the unwanted names, we can drop the data before it reaches our indexers.
Hi, I am trying to convert a pie chart, which works and displays correctly, to a donut chart, but it doesn't display any values. Has anyone come across this issue, and what was the fix? Thanks, Terry
As we have disk storage size budgets, we would like to know which is better: 2 indexers in a cluster with RF=1/SF=1, or 2 standalone indexers. We also have, on separate instances, 1 ES SH, 3 HFs, and 1 MN/DS. In terms of performance, will there be any differences? Thanks.
With your Splunk Enterprise & ES being VMs, how do the indexes & configs get backed up during the VM backups? Are there certain steps that need to be taken or monitored daily to make sure all is well? Thank you in advance.
We have three cases of wildcard renaming preceding an eval command that result in errors (searches below). In Case 1 we observe a silent error whereby a duplicate field (of the same name!) is created with a different value, and in Case 2 we have an overt error in eval ("Expected )"). The solution is to employ quotes, but the rules appear different in each case (see the table below). In light of this, how should we be thinking about wildcard/bulk renaming in an "as" clause preceding an eval?

The apparent rules are summarized in the following table. The first row is modeled after the relevant elements of the final eval statement in the searches below. The red options do not work as expected (either due to creating a duplicate field in the text-prefix case, or due to an eval error in the numeric-prefix case).

                        | eval <field> = if( isnotnull(_____), _____, 0)
Case 1: Text prefix     <field> | "<field>" | '<field>'     <field> | '<field>'
Case 2: Numeric prefix  <field> | "<field>" | '<field>'     <field> | '<field>'
Case 3: Suffix          <field> | "<field>" | '<field>'     <field> | '<field>'

Reproduce Case 1 with this search (generates a duplicate field with value 0):

index=_internal sourcetype=splunkd earliest=-5m@m latest=@m
| timechart span=1m c as ct, avg(linecount) as lc by sourcetype
| rename ct:* as *_ct
| stats sum(*_ct) as *_txtprefix
| eval splunkd_txtprefix = if(isnotnull(splunkd_txtprefix),splunkd_txtprefix,0)

Reproduce Case 2 and Case 3 with this search (generates an eval error):

index=_internal sourcetype=splunkd earliest=-5m@m latest=@m
| timechart span=1m avg(linecount) as lc_100, max(linecount) as lc_200, sum(linecount) as 100_lc, min(linecount) as 200_lc
| stats sum(lc_*) as *_numprefix, sum(*_lc) as numsuffix_*
| eval 100_numprefix = if(isnotnull(100_numprefix),100_numprefix,0), numsuffix_100 = if(isnotnull(numsuffix_100),numsuffix_100,0)

You can resolve the error by replacing the final line with:

| eval 100_numprefix = if(isnotnull('100_numprefix'),'100_numprefix',0), numsuffix_100 = if(isnotnull(numsuffix_100),numsuffix_100,0)
Hello everyone, I have a situation with the KV Store. From the SH cluster nodes I am getting the following message:

KV Store changed status to failed. An error occurred during the last operation ('getServerVersion', domain: '15', code: '13053'): No suitable servers found (`serverSelectionTryOnce` set): [connection closed calling ismaster on 'servername:8191'

mongod.log:

ASIO [NetworkInterfaceASIO-Replication-0] Dropping all pooled connections to servername:8191 due to failed operation on a connection
REPL_HB [replexec-3] Error in heartbeat (requestId: 6289) to servername:8191, response status: HostUnreachable: No connection could be made because the target machine actively refused it.
2021-07-01T20:35:18.370Z I NETWORK [listener] connection accepted from IP:53020 #1194 (12 connections now open)

There are no issues related to the port, because port 8191 is open, and I have already updated the server's certificate. I have three SHs, and none of them has the KV Store status "ready". Do you have any idea what could be happening?

Regards.
For the new Dashboard Studio, is there a way to add options to export CSV or PDF, like in the old dashboard app? We only have the option to export .png.
I am trying to remove logs based on a lookup. This is what I am using:

index=myindex "string_to_search_for" NOT
    [inputlookup mylookup
    | rename IP as host
    | field host]

The end result is to exclude any logs that have the "host" field in the event. My inputlookup returns the correct value, but my NOT statement isn't doing anything. I am very new to Splunk, so I am sure that I am missing something pretty easy. Thanks for the help!
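The intended logic, drop every event whose host appears in the lookup, is a set difference. A sketch of that idea in Python, independent of the SPL syntax question (the host values and event records below are invented for illustration):

```python
# Hosts from the lookup (the subsearch's output), as a set for O(1) membership tests
lookup_hosts = {"10.0.0.5", "10.0.0.9"}

# Hypothetical events; 'host' mirrors the lookup's renamed IP column
events = [
    {"host": "10.0.0.5", "msg": "string_to_search_for hit"},
    {"host": "10.0.0.7", "msg": "string_to_search_for hit"},
]

# Keep only events whose host is NOT in the lookup
kept = [e for e in events if e["host"] not in lookup_hosts]
print([e["host"] for e in kept])
```

Only the event whose host is absent from the lookup survives the filter.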
Can Splunk ES (Enterprise Security) work independently of Splunk Enterprise? I mean, does one have to have Splunk Enterprise for ES to work? My understanding is that ES complements Splunk Enterprise. Please advise. Thank you in advance.
Hi, I am looking to send an email to a user with a simple yes/no response, which I can then use to handle the case. I know Palo Alto's SOAR has some integrations to handle this. Do we have similar functionality in Splunk, or any ideas on how to implement it? Thanks in advance.
Hi, I am using a stats command with a time field in the "by" clause, but I am not getting the result. If I remove the time field, I get the desired result. I want all the fields in my outcome; how can I obtain them?

stats latest(et) as "ET" by status, id, starttime

If I remove "starttime", I get the outcome, but I need that field in my outcome as well. Can I put it in a separate table? But there, too, this column is not showing. I want all of these fields: ET, status, id, starttime. What should I do?
Would it be possible to configure the Splunk UF to scan (pick up) files/data from the server at a particular time of day/week/month and forward them to the Splunk indexer? That is, to set the frequency at which the UF picks up/scans files/data and forwards them to the Splunk indexer. If so, any help on how to do that would be highly appreciated.