All Posts


Hello Splunk experts, I would like to know: is there an API that can access all events being generated in Splunk, irrespective of any search? Please suggest! Thank you in advance. Regards, Eshwar
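For reference, Splunk's REST API returns events only through a search job; there is no search-independent "all events" endpoint. The closest equivalent is exporting a search over all indexes. A minimal sketch (host, credentials, and time range are placeholders):

```shell
# Export everything from all indexes for the last 5 minutes as JSON.
# Note: index=* over a long time range can be extremely expensive.
curl -k -u admin:changeme \
  https://localhost:8089/services/search/v2/jobs/export \
  -d search="search index=* earliest=-5m" \
  -d output_mode=json
```

For a continuous feed rather than polling, forwarding the data onward (e.g. to a third-party system) at ingest time is usually a better fit than the search API.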
Hi, has this been resolved? I would like to know the solution.
Hi @danielcj, Here is the configuration:

[WinHostMon://Service]
interval = 600
disabled = 0
type = Service
index = windows

After executing "splunk list inputstatus" on the UF, I could not find splunk-winhostinfo.exe (WinHostMon://Service) running.
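For comparison, a known-good WinHostMon stanza looks much the same, so the stanza itself is probably not the issue. A sketch of the usual checklist (app name is a placeholder; the stanza must live in an app's inputs.conf on the Windows UF):

```ini
# etc/apps/<your_app>/local/inputs.conf on the Universal Forwarder
[WinHostMon://Service]
type = Service
interval = 600
disabled = 0
index = windows
```

Things worth verifying: the forwarder is a Windows UF (WinHostMon is Windows-only), splunkd was restarted after the change, "splunk btool inputs list WinHostMon --debug" shows the stanza winning, and the windows index exists on the indexers.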
We currently use a User service account to bind with Splunk for LDAP authorization. Is there a way to use Active Directory Managed Service Accounts instead to reduce the overhead of maintaining passwords?
Hi all, I'm a Splunk beginner, and I want to show and hide corresponding pie charts using a check box. Can someone please guide me on how to achieve this? Any help or example queries would be greatly appreciated. Thank you!
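One common pattern in Simple XML is to bind a checkbox to a token and gate the panel with depends. A minimal sketch (token names, the search, and labels are made up for illustration):

```xml
<form>
  <fieldset>
    <input type="checkbox" token="chk">
      <label>Display options</label>
      <choice value="true">Show pie chart</choice>
      <change>
        <!-- set/unset a token the panel depends on -->
        <condition value="true"><set token="show_pie">true</set></condition>
        <condition><unset token="show_pie"></unset></condition>
      </change>
    </input>
  </fieldset>
  <row>
    <panel depends="$show_pie$">
      <chart>
        <search><query>index=_internal | stats count by sourcetype</query></search>
        <option name="charting.chart">pie</option>
      </chart>
    </panel>
  </row>
</form>
```

A panel whose depends token is unset is hidden, so ticking and unticking the box toggles the chart. Repeat the pattern with one token per chart to control several pie charts independently.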
@richgalloway  So I just have to create an index with the same name on the indexers?
It sounds like the new index was created on the HF, but not on the indexers.  The index must exist on the indexers so they have a place to store the data.
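In a clustered environment, that typically means an indexes.conf stanza distributed from the cluster manager. A minimal sketch (the index name and app name are placeholders):

```ini
# On the cluster manager, e.g.
# $SPLUNK_HOME/etc/manager-apps/<your_app>/local/indexes.conf
# (master-apps on older versions), then apply the cluster bundle.
[your_index_name]
homePath   = $SPLUNK_DB/your_index_name/db
coldPath   = $SPLUNK_DB/your_index_name/colddb
thawedPath = $SPLUNK_DB/your_index_name/thaweddb
```

On standalone indexers the same stanza can go directly into an app's local/indexes.conf, followed by a restart.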
Thanks!! @gcusello @ITWhisperer 
Does anyone know how to invoke a macro on Splunk Cloud using the REST API? I am using the following command, but it always returns "No matching fields exist." I am able to run the same macro directly from the Splunk search page, and there it does return results.

curl -k -u user:"password" https://company.splunkcloud.com:8089/services/search/v2/jobs/export -d exec_mode=oneshot -d search="\`lastLoginStatsByUserProd(userid,7)\`" -d output_mode=json
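One thing worth checking: the search parameter of the export endpoint must be a complete search string, so if the macro expands to a bare base search (index=... terms), it needs a leading "search" written out, just as it would on the first line of the search bar. A sketch of that variant (host and credentials are placeholders; whether it applies depends on what the macro expands to):

```shell
curl -k -u user:"password" \
  https://company.splunkcloud.com:8089/services/search/v2/jobs/export \
  -d exec_mode=oneshot \
  -d output_mode=json \
  -d search="search \`lastLoginStatsByUserProd(userid,7)\`"
```

Also confirm the macro is shared globally (or at least to the app context the REST search runs in), since a private or app-scoped macro will not expand for a REST call made outside that context.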
Do you have some custom extraction in this sourcetype that is preventing Splunk from automatically extracting these fields? With the exception of a typo in your data sample (Filed-Type should be Field-Type, as in the other rows), the following is an emulation:

| makeresults
| eval data = split("Field-Type=F_Type_1,.....,Section=F_Type_1_Value Field-Type=F_Type_2,.....,Section=F_Type_2_Value Field-Type=F_Type_3,.....,Section=F_Type_3_Value", " ")
| mvexpand data
| rename data AS _raw
| extract
``` data emulation above ```

Note that extract is implied in most sourcetypes. The emulation yields:

Field_Type   Section          _raw                                               _time
F_Type_1     F_Type_1_Value   Field-Type=F_Type_1,.....,Section=F_Type_1_Value   2024-02-13 16:15:12
F_Type_2     F_Type_2_Value   Field-Type=F_Type_2,.....,Section=F_Type_2_Value   2024-02-13 16:15:12
F_Type_3     F_Type_3_Value   Field-Type=F_Type_3,.....,Section=F_Type_3_Value   2024-02-13 16:15:12

Are you not getting fields Field_Type and Section (which in your illustration of desired results is just Field-Value)? There should be no regex needed. (Also, regex is not the best tool for this rigidly formatted data.)

If you already get Field_Type and Section, the following will give you what you illustrated:

| sort host _time
| rename Field_Type as Field-Type, Section as Field-Value
| table _time host Field-Type Field-Value
What does the "1d@d" for span mean? I'm just speculating that you want to count calendar days, not arbitrary 24-hour periods from the time of your search.  If not, lose that @d. (The "@" notation is called "snap-to".  See Specify a snap to time unit.)
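To illustrate the difference with generic time modifiers (not specific to your search):

```
earliest=-7d@d latest=@d    ``` the previous 7 whole calendar days, ending at last midnight ```
earliest=-7d                ``` the 168 hours immediately before the search time ```
```

The same idea applies to bucketing: calendar-day buckets start at midnight, while plain 1d buckets are measured from the search's own reference time.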
Using index _ad in a subsearch to limit _network output will definitely improve performance; in fact, that's exactly the suggestion that I scrapped because it was mathematically different from your original search. Hence Question #2: "Your original search has a common field name count in both the outer search and the subsearch... My (previous) search gives the count of matching events ONLY. Which count is needed?" If you are only counting matching events, your new search should work. Does it perform well enough? Or are there still mathematical problems? As a general rule, if a search meets the needs of the current use case, defer any optimization.

By the way, you do not need | format (with no options). Splunk optimization will drop it silently, anyway. The most common use of format is to help users verify whether a subsearch will produce the desired search strings. (Another use is to fine-tune subsearch output, but that cannot be achieved with no options.)
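For instance, to inspect what a subsearch would inject into the outer search, you can run the subsearch body on its own with format appended (a generic sketch; index and field names are placeholders):

```
index=_ad | stats count by user | fields user | format
```

This produces a single result with a search field such as ( ( user="alice" ) OR ( user="bob" ) ), which is exactly the string the outer search would receive from the subsearch.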
Hi, we have deployed the Cloudflare TA app on one of our SHs. Could anyone help me fix the log-parsing issue in Splunk? App link: splunkbase.splunk.com/app/5114 Thanks
Database logs on a dashboard are not showing in Splunk. Is there anything I can do to make it work?
I am having this same issue. Were you able to resolve it? If so, what steps did you take?
Greetings! We are trying to generate a table from the output of a Splunk query. We are trying to pipe (|) this into our query but do not know how. Can someone assist?

This is the output after we ran our Splunk query:

Feb 13 20:36:21 hostname1 sshd[100607]: pam_unix(sshd:session): session opened for user user123 by (uid=0)
Feb 13 20:36:23 hostname2 sshd[100608]: pam_unix(sshd:session): session opened for user user345 by (uid=0)

We want to capture it in a table of this form:

Time               Hosts       Users
Feb 13 20:36:21    hostname1   user123
Feb 13 20:36:23    hostname2   user345

And so on. How do we do this? Thank you in advance!
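Since these are standard syslog events, Splunk's default _time and host fields already cover the first two columns, so only the user needs extracting. A sketch (the index, sourcetype, and Users field name are placeholders):

```
index=os sourcetype=linux_secure "session opened for user"
| rex "session opened for user (?<Users>\S+)"
| table _time host Users
| rename _time as Time, host as Hosts
```

Appending this after your base search (everything from | rex onward) should produce the table shown, with one row per session-open event.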
I need some help updating the mmdb file for the iplocation command. I've read the other forum questions regarding this, as well as the docs, and I am a bit confused.

I initially uploaded the new mmdb file from MaxMind, GeoLite2-City.mmdb, through the GeoIP panel under the Lookups tab. The upload reports success, but I can't find the file afterwards. I am looking on the specific server I uploaded it to; we have a clustered environment, but that one server should have it. I ran locate and find commands but could not locate it. We still have the original under $SPLUNK_HOME/share/dbip-city-lite.mmdb. I don't see any trace of the upload through splunkd, through /export/opt/splunk/var/run/splunk/upload/, or through any find or locate command.

I want to update the file path to include both databases, and I know I need to change limits.conf to include both paths. But the question is: how do I change limits.conf so that it replicates? We don't have any app named TA-geoisp or anything similar, which is what these forums and docs reference.

Somewhere I saw that I could update the search app's limits.conf and push it from the shcluster directory, since that pushes a bundle change out to all search heads in the cluster. Since the search app is the default app, we could just use it to point to the mmdb files. But we don't have the search app under $SPLUNK_HOME/etc/shcluster/apps/ on our cluster master/deployer, so I think I might be missing something. Basically, I would just like to update limits.conf to point to the new directory path of both mmdb files.
I'd like to just edit limits.conf to look like:

[iplocation]
MMDBPaths = /path/to/your/GeoIP2-City.mmdb,/path/to/your/dbip-city-lite.mmdb

The question I'm trying to ask here is: when I upload the file through the GUI, where does the file end up? And if I wanted to push these changes manually to all search heads and indexers from the deployer and deployment server, how do I go about replicating the folder that holds the mmdb files as well as the limits.conf that holds the paths to them?

Thank you for any assistance.
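If the goal is just to distribute both .mmdb files and the limits.conf together, one approach (a sketch; the app name is made up, and the MMDBPaths setting is taken from your example, so verify it against the limits.conf spec for your Splunk version) is a small custom app on the deployer rather than the search app:

```
# Layout on the deployer:
#   $SPLUNK_HOME/etc/shcluster/apps/my_geoip/local/limits.conf
#   $SPLUNK_HOME/etc/shcluster/apps/my_geoip/data/GeoLite2-City.mmdb
#   $SPLUNK_HOME/etc/shcluster/apps/my_geoip/data/dbip-city-lite.mmdb
# then push with: splunk apply shcluster-bundle -target https://<one_sh>:8089

# local/limits.conf
[iplocation]
MMDBPaths = /opt/splunk/etc/apps/my_geoip/data/GeoLite2-City.mmdb,/opt/splunk/etc/apps/my_geoip/data/dbip-city-lite.mmdb
```

The same app can be placed under the cluster manager's manager-apps directory to reach the indexers, though iplocation runs at search time, so the search heads are usually the ones that matter.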
Correct. This is applicable for 9.1.0 and above.
Still trying to get only the Russian IPs. It still pulls the private IPs.
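iplocation leaves the Country field empty for RFC 1918 addresses rather than removing the events, so the private ranges have to be filtered out explicitly. A sketch (index and src_ip are placeholder names; substitute your own base search and IP field):

```
index=network
| where NOT (cidrmatch("10.0.0.0/8", src_ip)
         OR cidrmatch("172.16.0.0/12", src_ip)
         OR cidrmatch("192.168.0.0/16", src_ip))
| iplocation src_ip
| where Country="Russia"
```

Filtering before iplocation also saves the lookup cost on addresses that could never match a country anyway.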
I am relatively new to the Splunk coding space, so bear with me regarding my inquiry. Currently I am trying to create a table where each row has the _time, host, and a unique field extracted from the entry:

_Time   Host                Field-Type   Field-Value
00:00   Unique_Host_1   F_Type_1    F_Type_1_Value
00:00   Unique_Host_1   F_Type_2    F_Type_2_Value
00:00   Unique_Host_1   F_Type_3    F_Type_3_Value
00:00   Unique_Host_2   F_Type_1    F_Type_1_Value
00:00   Unique_Host_2   F_Type_2    F_Type_2_Value
00:00   Unique_Host_2   F_Type_3    F_Type_3_Value
..

The data given for each server:

Field-Type=F_Type_1,.....,Section=F_Type_1_Value
Field-Type=F_Type_2,.....,Section=F_Type_2_Value
Filed-Type=F_Type_3,.....,Section=F_Type_3_Value

I have created 3 field extractions for the F_Type values:

(.|\n)*?\bF_Type_1.*?\b Section=(?<F_Type_1_Value>-?\d+)

This is what I have done so far for the table:

index="nothing" sourcetype="nothing" | first(F_Type_1) by host

I am not sure this is the best approach, and I can also refine the field extraction if needed. Generally, my thought process is: Source | Obtain first entries for all the hosts | Extract field values | Create table. But I am currently hitting a roadblock in the syntax to create rows for each of the unique Field-Types and their values.
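Rather than one extraction per F_Type, a single pair of generic extractions can turn every row into a Field-Type/Field-Value pair, which naturally gives one table row per type per host. A sketch (index and sourcetype are placeholders, and FieldType/FieldValue are made-up field names):

```
index="nothing" sourcetype="nothing"
| rex "Field-Type=(?<FieldType>[^,]+)"
| rex "Section=(?<FieldValue>\S+)"
| dedup host FieldType
| table _time host FieldType FieldValue
| sort host FieldType
```

dedup keeps only the most recent event per host/type combination, which matches the "first entries for all the hosts" step in your plan (sort by _time first if you need the oldest instead).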