All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Team, my CSV file contains a header like the one below (the 1st line in the CSV). How can I create a transformation for field extraction?

"State","Location name","Primary Number"

It is only retrieving the fields State and Location. Expected fields: State, Location name, Primary Number.
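A minimal props.conf/transforms.conf sketch for delimiter-based header extraction (the sourcetype name my_csv is an assumption; adjust it to the actual sourcetype):

```
# props.conf (sourcetype name "my_csv" is hypothetical)
[my_csv]
REPORT-csvfields = extract_csv_fields

# transforms.conf
# Quotes are needed around field names that contain spaces.
[extract_csv_fields]
DELIMS = ","
FIELDS = "State", "Location name", "Primary Number"
```

Alternatively, setting INDEXED_EXTRACTIONS = csv in props.conf on the forwarder reads the field names directly from the header line.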
After Splunk Forwarder installation, the deployment server is not able to push the configuration to the forwarder, or the forwarder is not able to communicate with Splunk.
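A first thing to check (a sketch, assuming the default management port 8089): the forwarder's deploymentclient.conf should point at the deployment server, and the forwarder must be able to reach that host and port:

```
# deploymentclient.conf on the forwarder
[target-broker:deploymentServer]
targetUri = <deployment-server-host>:8089
```

After editing, restart the forwarder and check whether the client appears under Forwarder Management on the deployment server.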
Hello everyone, I want the Kerio Control technical guide that provides details on how to set up and configure a Syslog server to send logs to Splunk log management systems.
Hello, I have a table in a dashboard like so:

User  ID1  ID2
A     ABC  123
B     DEF  456
C     GHJ  789

I have set drilldown tokens like so:

<set token="id1">$row.ID1$</set>
<set token="id1">$row.ID1$</set>

I have a table below that I want to pass these tokens down to. For example, when I click ABC in ID1, it should pass the token id1 (value "ABC") to the query below; when I click 456 in ID2, it should pass the token id2 (value "456") to the query below. The query will be index=myindex $id1$ (if I click on a value in the ID1 column) or index=myindex $id2$ (if I click on a value in the ID2 column). A push in the right direction would be very appreciated. Thank you
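A sketch of one way to set a different token per column, using <condition field="..."> inside the table's drilldown (token and field names follow the post; unsetting the other token is an assumption about the desired behavior):

```
<drilldown>
  <condition field="ID1">
    <set token="id1">$row.ID1$</set>
    <unset token="id2"></unset>
  </condition>
  <condition field="ID2">
    <set token="id2">$row.ID2$</set>
    <unset token="id1"></unset>
  </condition>
</drilldown>
```

The lower panel's query can then reference $id1$ or $id2$ depending on which column was clicked.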
Hi All, I recently found out that my syslog server is creating duplicates for all log files. I checked the packet capture (pcap) from one host and it contains unique logs, but syslog has duplicates. How do I prevent syslog from creating duplicate logs? Is there a way to prevent Splunk from ingesting duplicate logs? #syslog #linux #duplicates
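A quick SPL sketch (the index name is an assumption) to confirm whether the duplicates already indexed in Splunk are byte-identical raw events, which narrows down where the duplication happens:

```
index=syslog
| stats count by _raw
| where count > 1
```

If this returns rows, the duplication occurred before or during ingestion; if not, the duplicates differ in some field (for example the timestamp or host) and the syslog configuration is the place to look.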
Hello, we attempted to upgrade the Splunk OTel Collector on the cluster using the helm3 upgrade command, but encountered the following error:

Error: UPGRADE FAILED: parse error at (splunk-otel-collector/templates/operator/_helpers.tpl:8): unclosed action
I want to know if Splunk Machine Learning Toolkit version 5.3.1 is compatible with Splunk 9.1.3.
I'm practicing automatic lookups. The auto-lookup of vendors_ip.csv is already working in my index. Now I would like to add an auto-lookup for the prices.csv file in the same index. The process I followed: I uploaded a lookup table, created a lookup definition, and created an automatic lookup. But when searching index=main, only the prices.csv fields are not visible; the vendors_ip fields that were previously successful are output. What I'm curious about is whether it is possible to apply multiple automatic lookups to one index in Splunk. I would also like to know why the automatic lookup is not working.
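Multiple automatic lookups can coexist on the same sourcetype, source, or host stanza: each gets its own LOOKUP-<name> line in props.conf. A sketch (the stanza, lookup, and field names here are examples, not taken from the post):

```
# props.conf
[mysourcetype]
LOOKUP-vendors = vendors_ip_lookup src_ip OUTPUT vendor
LOOKUP-prices  = prices_lookup product_id OUTPUT price
```

A common reason an automatic lookup silently does nothing is that the input field name in the lookup definition does not exactly match (including case) a field that actually exists in the events.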
I have a macro like this:

1 + if(true(), 1, `myMacro(1)`)

And I get an infinite recursion error when I use it in a query like this:

| makeresults
| eval foo = `myMacro(1)`

Error in 'SearchParser': Reached maximum recursion depth (100) while expanding macros. Check for infinitely recursive macro definitions.

It seems like the macro call is expanded during the "expanding macros" operation, and that causes the infinite recursion. This is unexpected, since the recursive macro call is in a branch of the code that can never execute. I can't imagine any way a macro could actually use recursion: the macro expands outside the context of the recursive logic, so it will always get a recursion error.
Hello, I have an issue where a base search is not working on my dashboard. Interestingly, if I click on the search icon, it comes up with a valid search query and shows some results. However, on the dashboard itself it shows "no results found". Below is what I currently have set:

<search id="prod_request">
  <query>type="request" "request.path"="prod/"</query>
  <earliest>$timerange.earliest$</earliest>
  <latest>$timerange.latest$</latest>
  <sampleRatio>1</sampleRatio>
  <refresh>10m</refresh>
  <refreshType>delay</refreshType>
</search>
<chart>
  <title>Top 10 request</title>
  <search base="prod_request">
    <query>| stats count by auth.account_namespace | sort - count | head 10 | transpose 0 header_field=auth.account_namespace column_name=account_namespace | eval account_namespace=""</query>
  </search>
  <option name="charting.axisTitleX.text">Account Namespace</option>
  <option name="charting.chart">bar</option>
  <option name="charting.chart.showDataLabels">all</option>
  <option name="charting.chart.stackMode">default</option>
  <option name="charting.drilldown">all</option>
  <option name="charting.legend.placement">right</option>
  <option name="charting.seriesColors">[0x1e93c6, 0xf2b827, 0xd6563c, 0x6a5c9e, 0x31a35f, 0xed8440, 0x3863a0, 0xa2cc3e, 0xcc5068, 0x73427f]</option>
  <option name="refresh.display">progressbar</option>
</chart>
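One common cause, offered as a sketch rather than a confirmed diagnosis: a non-transforming base search only passes a limited set of fields to its post-process searches, so it usually needs to end with | fields naming everything the child searches use:

```
<search id="prod_request">
  <query>type="request" "request.path"="prod/" | fields auth.account_namespace</query>
  <earliest>$timerange.earliest$</earliest>
  <latest>$timerange.latest$</latest>
</search>
```

Opening the search from the panel's magnifying-glass icon re-runs it standalone, which is why it can succeed there while the post-processed panel shows no results.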
Hi, does anyone know if it's possible to replace the hard-coded javaHome key in DB Connect's dbx_settings.conf file with the java_home environment variable in Windows? I have auto-patching set up on a Windows Splunk heavy forwarder, and every time Java gets upgraded, it crashes the Splunk service. For example, I'd like to replace this:

javaHome = C:\Program Files\java\jdk-17.0.11.9-hotspot

With this:

javaHome=%java_home%

The above syntax doesn't work, although I'm not sure if it's a syntax issue or a functionality issue. Thanks!
Hello, we have created lookup definitions that use CIDR matching for IPv4 addresses, and they work as expected. We are running into issues with IPv6. We are trying to create a lookup definition that does a CIDR lookup on an IPv6 address. The lookup file uses CIDR notation; one example from the file is 2a02:4780:10::/44. The IP that should match is 2a02:4780:10:5be5::1. The lookup definition is CIDR(network). Are IPv6 CIDR lookups supported? If not, how can we write the lookup definition to satisfy the requirement?
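For reference, a transforms.conf sketch of how the CIDR match is declared (the lookup name and filename are examples; whether IPv6 prefixes match can depend on the Splunk version, so treat this as a shape to test rather than a guarantee):

```
# transforms.conf
[network_lookup]
filename = networks.csv
match_type = CIDR(network)
max_matches = 1
```

The field named inside CIDR(...) must be the lookup-file column that holds the CIDR prefixes, not the field in the events.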
I have a customer asking why we have a link describing the new "features" for version 4.0.3 if this version was never released; we went from version 4.0.2 to 4.0.4. See the attached file.
Drilldown with transpose is not working as expected to fetch the row and column values; it's not giving me accurate results, and I'm not sure if this is related to transpose.

index=wso2 source="/opt/log.txt" "Count_Reportings"
| fields api-rep rsp_time mguuid
| bin _time span=1d
| stats values(*) as * by _time, mguuid
| eval onesec=if(rsp_time<=1000,1,0)
| eval threesec=if(rsp_time>1000 and rsp_time<=3000,1,0)
| eval threesecGT=if(rsp_time>3000,1,0)
| eval Total = onesec + threesec + threesecGT
| stats sum(onesec) as sumonesec sum(threesec) as sumthreesec sum(threesecGT) as sumthreesecGT sum(Total) as sumtotal by api-rep, _time
| eval good = if(api-rep="High", sumonesec + sumthreesec, if(api-rep="Medium", sumonesec + sumthreesec, if(api-rep="Low", sumonesec, null())))
| eval per_call=if(api-rep="High", (good / sumtotal) * 100, if(api-rep="Medium" , (good / sumtotal) * 100, if(api-rep="Low" , (good / sumtotal) * 100, null())))
| eval per_cal=round(per_call,2)
| timechart span=1d avg(per_cal) by api-rep
| eval time=strftime(_time, "%Y-%m-%d")
| fields - _time _span _spandays
| fillnull value=0
| transpose 0 header_field=time column_name=APIs include_empty=true

Above is the query. When I click on 99.93, I need to pick GOOD and the column header 2024-06-30 and pass them into the drilldown query. When I click 99.93 in column 2024-06-30, it gives me the output below; it is not giving me the row value as Good. Below are the drilldown tokens:

tokClickValue1 = $click.value$
tokClickName1 = $click.name$
tokClickValue2 = $click.value2$
tokClickName2 = $click.name2$
tokApi = $row.APIs$

I want the tokens to fetch the header and the APIs values to pass to the drilldown query.
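For a table panel, a sketch of drilldown tokens that capture the clicked column header, the clicked cell value, and the row's APIs value (token names follow the post):

```
<drilldown>
  <!-- $click.name2$: header of the clicked column, e.g. 2024-06-30 -->
  <set token="tokClickName2">$click.name2$</set>
  <!-- $click.value2$: value of the clicked cell, e.g. 99.93 -->
  <set token="tokClickValue2">$click.value2$</set>
  <!-- $row.APIs$: the APIs column's value for the clicked row -->
  <set token="tokApi">$row.APIs$</set>
</drilldown>
```

In a table drilldown, $click.value$/$click.name$ describe the first column of the clicked row, while the "2" variants describe the clicked cell itself, which is usually what a transposed table needs.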
I have this most weird situation. I use inputs.conf on the UF:

[monitor://C:\Users\xxx\OneDrive - xxx\xxx\Sources\On-Board\Splunk\test\eManager]
disabled = 0
index = main
sourcetype = el:PoC:eManager

On the HF (before the indexers) I use props.conf:

# For eManager PoC
[el:PoC:eManager]
INDEXED_EXTRACTIONS=JSON
TIMESTAMP_FIELDS=timestampUtc
TZ = UTC
SHOULD_LINEMERGE = false
AUTO_KV_JSON = false
KV_MODE = none
TRANSFORMS-sourcetype = change-eManagerSourcetype

and transforms.conf:

[change-eManagerSourcetype]
SOURCE_KEY = MetaData:Sourcetype
REGEX = (.*?)
FORMAT = sourcetype::el:eManager
DEST_KEY = MetaData:Sourcetype

Data gets ingested, and it all looks OK, EXCEPT when using this search:

index=main source=*_8.*
| rename _indextime as iTime
| foreach *Time [ | eval <<FIELD>>=strftime(<<FIELD>>,"%Y-%m-%d %H:%M:%S") ]
| stats latest(_time) AS _time count BY index sourcetype

I get this result:

index  sourcetype    _time                    count
main   el:eManager   2024-07-02 19:26:36.000  363
main   el:eOperator  2024-06-06 14:02:02.986  198

And when adding sourcetype="el:eManager" or just sourcetype="*" I get this:

index  sourcetype    _time                    count
main   el:eOperator  2024-06-06 14:02:02.986  198

It's like the sourcetype is somehow hidden, but not hidden, after the rename in transforms from "el:PoC:eManager" to "el:eManager". I can search by index and source and see the events, but I can no longer use sourcetype directly in a search. Can anyone explain, please?
Hello, I have a dashboard and I'd like to add a submit button, because currently if I change anything the search is launched automatically. I'd like to set everything first (the checkboxes, then the input field) and only then launch the search with a submit button. I've tried to add a button, but in that case I'm not able to choose the other checkbox options, only 'Any field'. Could you please help with the modification?

<form version="1.1" theme="light">
  <label>Multiselect Text</label>
  <init>
    <set token="toktext">*</set>
  </init>
  <fieldset submitButton="false">
    <input type="checkbox" token="tokcheck">
      <label>Field</label>
      <choice value="Any field">Any field</choice>
      <choice value="category">Group</choice>
      <choice value="severity">Severity</choice>
      <default>category</default>
      <valueSuffix>=REPLACE</valueSuffix>
      <delimiter> OR </delimiter>
      <prefix>(</prefix>
      <suffix>)</suffix>
      <change>
        <eval token="form.tokcheck">case(mvcount('form.tokcheck')=0,"category",isnotnull(mvfind('form.tokcheck',"Any field")),"Any field",1==1,'form.tokcheck')</eval>
        <eval token="tokcheck">if('form.tokcheck'="Any field","REPLACE",'tokcheck')</eval>
        <eval token="tokfilter">if($form.tokcheck$!="Any field",replace($tokcheck$,"REPLACE","\"".$toktext$."\""),$toktext$)</eval>
      </change>
    </input>
    <input type="text" token="toktext">
      <label>Value</label>
      <default>*</default>
      <change>
        <eval token="tokfilter">if($form.tokcheck$!="Any field",replace($tokcheck$,"REPLACE","\"".$toktext$."\""),$toktext$)</eval>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <event>
        <title>$tokfilter$</title>
        <search>
          <query>index=* $tokfilter$</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="refresh.display">progressbar</option>
      </event>
    </panel>
  </row>
</form>

Thank you very much in advance!
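For reference, the usual Simple XML way to require an explicit submit is the fieldset attributes below (a sketch only; it does not by itself resolve the checkbox interaction described above, since token <change> handlers still fire as inputs change):

```
<fieldset submitButton="true" autoRun="false">
  <!-- searchWhenChanged="false" keeps this input from kicking off the search on change -->
  <input type="text" token="toktext" searchWhenChanged="false">
    <label>Value</label>
    <default>*</default>
  </input>
</fieldset>
```

With submitButton="true", searches that depend on form tokens wait until the Submit button is pressed.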
I am a noob in SOC and have just started learning; I use Splunk for practice. Recently I installed the Universal Forwarder in my VMware Windows 10 VM and Splunk Enterprise on my main desktop. During the Universal Forwarder installation, I entered my main PC's IP in the Deployment host field and the VM's Windows IP in the Receiving host option (the port was the default). But after installing, Forwarder Management in Splunk is not showing any client. I tried my main IP in both fields, changed my IP between static and dynamic, and tried both the Bridged and NAT network options in the VM. Nothing is working; Splunk is not connecting to the client. I need urgent help to start practicing. Hoping for an actual solution. Thanks in advance.
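If the deployment server value entered during installation didn't take effect, it can be set (or corrected) from the forwarder's CLI; a sketch, assuming the default management port 8089:

```
# run from the UF's bin directory on the VM
splunk set deploy-poll <main-pc-ip>:8089
splunk restart
```

Note that Forwarder Management lists clients that phone home to the deployment server on the management port (default 8089); the receiving port (default 9997) is a separate setting used only for sending data.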
This default alert does not work:

| rest splunk_server_group=dmc_group_license_master /services/licenser/pools

It does not even return any results. Anyone have ideas?
| makeresults format=csv data="QUE_NAM,FINAL,QUE_DEP
S_FOO,MQ SUCCESS,
S_FOO,CONN FAILED,
S_FOO,MEND FAIL,
S_FOO,,3"
| stats sum(eval(if(FINAL=="MQ SUCCESS", 1, 0))) as good sum(eval(if(FINAL=="CONN FAILED", 1, 0))) as error sum(eval(if(FINAL=="MEND FAIL", 1, 0))) as warn avg(QUE_DEP) as label by QUE_NAM
| rename QUE_NAM as to
| eval from="internal", label="Avg: ".label." Good: ".good." Warn: ".warn." Error: ".error
| append [| makeresults format=csv data="queue_name,current_depth
BAR_Q,1
BAZ_R,2"
| bin _time span=10m
| stats avg(current_depth) as label by queue_name
| rename queue_name as to
| eval from="external", label="Avg: ".label
| appendpipe [ stats values(to) as from | mvexpand from | eval to="internal" ]]

How can I add a different icon for each node in the flow map viz? Please help me with that. Thanks in advance.
Hi, I have a correlation search created in Enterprise Security, scheduled as below:

Mode: guided
Time range: Earliest: -24h, Latest: Now
Cron: 0 03 * * *
Scheduling: realtime, schedule window: auto, priority: auto
Trigger: alert when greater than 0
Throttling: window duration: 0
Response action: To: mymailid, priority: normal, Include: link to alert, link to results, trigger condition, attach CSV, trigger time

With this setup, the mail is not getting delivered regularly. If I execute the same SPL query in search, it shows more than 300 rows of results.