All Posts

I have installed the latest Splunk with Splunk Enterprise Security on it. I have worked with Enterprise Security before, and there were some filters available to filter incidents; in this version (7.3.0) there are no filters. Is there anything I am doing wrong?
Hi @Niro, If your issue isn't resolved, it might be caused by a sourcetype overwrite on the PAN logs. pan:traffic is an overridden sourcetype, so please try applying the transforms setting to your original sourcetype. It should be pan:log or pan_log, depending on your input settings:

[pan:log]
TRANSFORMS-pan_user = pan_src_user
Hi, did you get this working?
Thanks @kamlesh_vaghela 
@Muthu_Vinith   Are you looking for something like this?

XML

<form version="1.1" theme="dark">
  <label>Checkbox</label>
  <fieldset submitButton="false">
    <input type="checkbox" token="checkbox_a">
      <label></label>
      <choice value="panel_a">Panel A</choice>
      <delimiter> </delimiter>
      <change>
        <condition match="$checkbox_a$==&quot;panel_a&quot;">
          <set token="tkn_panel_a">1</set>
        </condition>
        <condition>
          <unset token="tkn_panel_a"></unset>
        </condition>
      </change>
    </input>
    <input type="checkbox" token="checkbox_b">
      <label></label>
      <choice value="panel_b">Panel B</choice>
      <delimiter> </delimiter>
      <change>
        <condition match="$checkbox_b$==&quot;panel_b&quot;">
          <set token="tkn_panel_b">1</set>
        </condition>
        <condition>
          <unset token="tkn_panel_b"></unset>
        </condition>
      </change>
    </input>
  </fieldset>
  <row>
    <panel depends="$tkn_panel_a$">
      <title>Panel One $checkbox_a$</title>
      <chart>
        <search>
          <query>| makeresults | eval a=100</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="charting.chart">pie</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
    <panel depends="$tkn_panel_b$">
      <title>Panel Two $checkbox_b$</title>
      <chart>
        <search>
          <query>| makeresults | eval a=100</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="charting.chart">pie</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
</form>

I hope this will help you.

Thanks,
KV
If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.
Hello Splunk experts, I would like to know whether there is an API which can access all events that are being generated in Splunk, irrespective of a search? Please suggest! Thank you in advance. Regards, Eshwar
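For illustration, a minimal sketch of pulling events through the REST export endpoint (the host, credentials and time window below are placeholder assumptions, and a search over index=* for any sizeable range will be very expensive):

curl -k -u admin:changeme https://splunk.example.com:8089/services/search/v2/jobs/export \
  -d search="search index=* earliest=-5m" \
  -d output_mode=json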
Hi, has this been resolved? I would like to know the solution.
Hi @danielcj, Here is the configuration:

[WinHostMon://Service]
interval = 600
disabled = 0
type = Service
index = windows

After executing "splunk list inputstatus" on the UF, I could not find splunk-winhostinfo.exe (WinHostMon://Service) running.
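In case it helps, two commands that can be run on the forwarder to confirm whether the stanza is actually being read (this assumes the default Windows install path for a Universal Forwarder):

"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool inputs list WinHostMon --debug
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" list inputstatus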
We currently use a user service account to bind with Splunk for LDAP authorization. Is there a way to use Active Directory Managed Service Accounts instead, to reduce the overhead of maintaining passwords?
Hi all, I'm a Splunk beginner and I want to show and hide corresponding pie charts using a checkbox. Can someone please guide me on how to achieve this? Any help or example queries would be greatly appreciated. Thank you!
@richgalloway  So I just have to create an index with the same name on the indexers?
It sounds like the new index was created on the HF, but not on the indexers.  The index must exist on the indexers so they have a place to store the data.
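For illustration, a minimal indexes.conf stanza of the kind that would need to exist on each indexer (the index name and paths here are placeholders; in an indexer cluster this would normally be distributed from the cluster manager rather than edited by hand):

[my_new_index]
homePath   = $SPLUNK_DB/my_new_index/db
coldPath   = $SPLUNK_DB/my_new_index/colddb
thawedPath = $SPLUNK_DB/my_new_index/thaweddb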
Thanks!! @gcusello @ITWhisperer 
Does anyone know how to invoke a macro on Splunk Cloud using the REST API? I am using the following command, but it always returns the output "No matching fields exist." I am able to run the same macro directly from the Splunk Search page and it does return results.

curl -k -u user:"password" https://company.splunkcloud.com:8089/services/search/v2/jobs/export -d exec_mode=oneshot -d search="\`lastLoginStatsByUserProd(userid,7)\`" -d output_mode=json
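One thing that may be worth trying (this is an assumption, not a confirmed fix): search strings submitted to the search job endpoints generally need to begin with a search command, so prefixing the macro with the literal word search changes how the string is parsed, e.g.:

curl -k -u user:"password" https://company.splunkcloud.com:8089/services/search/v2/jobs/export \
  -d exec_mode=oneshot \
  -d search="search \`lastLoginStatsByUserProd(userid,7)\`" \
  -d output_mode=json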
Do you have some custom extraction in this sourcetype that is preventing Splunk from automatically extracting these fields? With the exception of a typo in your data sample (Filed_Type should be Field_Type, as in the other rows), the following is an emulation:

| makeresults
| eval data = split("Field-Type=F_Type_1,.....,Section=F_Type_1_Value Field-Type=F_Type_2,.....,Section=F_Type_2_Value Field-Type=F_Type_3,.....,Section=F_Type_3_Value", " ")
| mvexpand data
| rename data AS _raw
| extract ``` data emulation above ```

Note that extract is implicit in most sourcetypes. The emulation gives:

Field_Type   Section          _raw                                                _time
F_Type_1     F_Type_1_Value   Field-Type=F_Type_1,.....,Section=F_Type_1_Value    2024-02-13 16:15:12
F_Type_2     F_Type_2_Value   Field-Type=F_Type_2,.....,Section=F_Type_2_Value    2024-02-13 16:15:12
F_Type_3     F_Type_3_Value   Field-Type=F_Type_3,.....,Section=F_Type_3_Value    2024-02-13 16:15:12

Are you not getting fields Field_Type and Section (which in your illustration of desired results is just Field-Value)? There should be no regex needed. (Also, regex is not the best tool for this rigidly formatted data.) If you already get Field_Type and Section, the following will give you what you illustrated:

| sort host _time
| rename Field_Type as "Field-Type", Section as "Field-Value"
| table _time host Field-Type Field-Value
What does the "1d@d" for span mean? I'm just speculating that you want to count calendar days, not arbitrary 24-hour periods from the time of your search.  If not, lose that @d. (The "@" notation is called "snap-to".  See Specify a snap to time unit.)
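As a concrete illustration of snap-to in a time range (the index name here is just a placeholder): earliest=-7d@d starts at midnight seven days ago rather than exactly 168 hours before the search was run.

index=web earliest=-7d@d latest=@d
| timechart span=1d count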
Using index _ad in the subsearch to limit the _network output will definitely improve performance; in fact, that's exactly the suggestion that I scrapped because it was mathematically different from your original search. Hence Question #2: "Your original search has a common field name count in both the outer search and the subsearch... My (previous) search gives the count of matching events ONLY. Which count is needed?" If you are only counting matching events, your new search should work. Does it perform well enough? Or are there still mathematical problems? As a general rule, if a search meets the need of the current use case, defer any optimization. By the way, you do not need | format (with no options). Splunk optimization will drop it silently anyway. The most common use of format is to help the user verify whether a subsearch will produce the desired search strings. (Another use is to fine-tune subsearch output, but this cannot be achieved with no options.)
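For example, to preview exactly what a subsearch will hand to the outer search, you can run it on its own with format appended (the index, sourcetype and field names below are placeholders):

index=_ad sourcetype=ad:login
| dedup user
| fields user
| format

This returns a single search field such as ( ( user="alice" ) OR ( user="bob" ) ), which is the string the outer search would receive.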
Hi, we have deployed the Cloudflare TA app on one of our search heads. Could anyone help me fix the log parsing issue in Splunk? App link: splunkbase.splunk.com/app/5114 Thanks
Database logs on a dashboard are not showing in Splunk. Is there anything I can do to make it work?
I am having this same issue. Were you able to resolve it? If so, what steps did you take?