All Posts


How do I get the exception from the two tables below? The exception is John, who is not in the HR table.

User list from the servers:

Name   ID
Bill   23
Peter  24
john   25

HR Table:

Name   ID
Bill   23
Peter  24
Anita  27
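For reference, a minimal SPL sketch, assuming both lists live in CSV lookups (the file names users_from_servers.csv and hr_table.csv are placeholders) and that the match should ignore case ("john" vs "John"):

| inputlookup users_from_servers.csv
| eval name_lower=lower(Name)
| search NOT [ | inputlookup hr_table.csv | eval name_lower=lower(Name) | fields name_lower ]
| fields Name ID

Whatever rows remain are users seen on the servers but absent from the HR table.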
Hello all, I installed a Splunk add-on on my heavy forwarder just to test it first, and it worked fine. After that I copied the entire directory to the deployment server and pushed it back to the heavy forwarder because, you know, I want to manage everything from the deployment server (trying to be organized). The issue is that from the heavy forwarder GUI, when I click on the app icon it doesn't load: it gives me "500 Internal Server Error" (with the picture of the confused horse) and I have these error messages in the internal logs: "ERROR ExecProcessor [2341192 ExecProcessorSchedulerThread] - message from "/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/myapp_hf/bin/app.py" HTTP 404 Not Found -- Action forbidden." I forgot to mention that I changed the name of the original app in app.conf. I can't figure out why it is not working. Thanks for your help, Kaboom1
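For reference, a renamed app often breaks when the directory name and app.conf stop agreeing. A minimal app.conf sketch of the stanzas worth double-checking (the id and label values here are placeholders, not taken from the post):

# app.conf inside the renamed app directory, e.g. /opt/splunk/etc/apps/myapp_hf/local
[package]
id = myapp_hf
# id should match the app directory name exactly

[ui]
is_visible = true
label = My App HF

Restarting Splunk on the heavy forwarder after the deployment server push is also worth trying before digging further.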
@richgalloway Where exactly can we see this debug log setting in the DS?
Hi, I'm trying to use the REST API to get and post saved searches that are alerts, but for some reason it only returns data for reports. Has anyone else had this problem?

GET https://<host>:<mPort>/services/saved/searches
GET https://<host>:<mPort>/services/saved/searches/{name}
Hi. You can convert your times to epoch values and then subtract them. Here's an example:

| makeresults
| eval start="10/10/23 23:50:00.031", end="11/10/23 00:50:00.031"
| eval startepoch=strptime(start,"%m/%d/%y %H:%M:%S.%3N")
| eval endepoch=strptime(end,"%m/%d/%y %H:%M:%S.%3N")
| eval diff=endepoch-startepoch
| eval timediff=tostring(diff,"duration")

Note that the stray AM/PM markers from your example are dropped: 23:50 and 00:50 are already 24-hour times, which is what %H expects.
@yuanliu Yeah, it is a pain of a search. Here is the issue. A firewall device generates an event with a URL when certain policies are triggered by contractors. That is the initial search. The firewall team has a list of the URLs the contractors have access to, which is the CSV file. The firewall team wants to remove any URLs that aren't used in a period of time. Thus, I have to compare the firewall URLs to the CSV URLs and output any CSV URLs that aren't used in the time frame. This search finds the firewall events:

index=my_index sourcetype=my_sourcetype (rule=policy_1 OR rule=policy_2 OR rule=policy_3) [ | inputlookup my_list_of_urls.csv ]

My issue is the firewall events use the long URL and not the short one. From the firewall:

a478076af2deaf28abcbe5ceb8bdb648.fp.measure.office.com/
aad.cs.dds.microsoft.com/

From the CSV file:

*.microsoft.com/
microsoft.com/
*.office.com/
office.com/

The two events from the firewall mean that two of the entries in the CSV file are still good and don't need to be removed. I try to think of this as two sets, one the firewall results and the other the CSV file, but I can't figure out how to search the firewall results with what is in the CSV file. Does this make sense? TIA, Joe
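For reference, one way to bridge the long and short forms is a wildcard lookup. A sketch under these assumptions: the firewall events carry the URL in a field named url, the CSV has two columns, url (the wildcard pattern) and pattern (a plain copy of it, so there is something to OUTPUT), and the lookup name is a placeholder.

# transforms.conf
[allowed_urls_wildcard]
filename = my_list_of_urls.csv
match_type = WILDCARD(url)

Then, to list the patterns with no matching firewall event in the time range:

| inputlookup my_list_of_urls.csv
| search NOT [
    search index=my_index sourcetype=my_sourcetype (rule=policy_1 OR rule=policy_2 OR rule=policy_3)
    | lookup allowed_urls_wildcard url OUTPUT pattern
    | stats count by pattern
    | fields pattern ]

Whatever survives the NOT is a CSV entry that nothing matched during the window, i.e. a candidate for removal.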
Hi Team, is there any way we can calculate the time duration between two different events, like a start and an end? For example: we have a start event at 10/10/23 23:50:00.031 and an end event at 11/10/23 00:50:00.031. How can we calculate this? Please help. Thank you
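For reference, a minimal sketch, assuming the start and end events share a correlation field (called id here as a placeholder) and can be told apart by a "start"/"end" marker in the event text:

index=my_index ("start" OR "end")
| stats earliest(_time) AS start_time latest(_time) AS end_time by id
| eval duration_secs=end_time-start_time
| eval duration=tostring(duration_secs,"duration")

tostring(..., "duration") renders the difference as HH:MM:SS rather than raw seconds.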
Thank you for your help with this question. Can you also help with this related question? Thank you so much. https://community.splunk.com/t5/Splunk-Search/How-to-calculate-total-when-aggregating-using-stats-max-field/m-p/660403#M227978
How to calculate a total when aggregating using stats max(field)? Thank you for your help.

Max TotalScore is the sum of the maximum of each Score field when aggregating all rows using stats: max(Score1), max(Score2), max(Score3). TotalScore is the total of the Score fields for each row (without aggregation).

This is the output I need:

Class   Name     Subject  TotalScore  Score1  Score2  Score3  Max TotalScore
ClassA  grouped  grouped  240         85      95      80      260

My Splunk search:

index=scoreindex
| stats values(Name) as Name, values(Subject) as Subject, max(TotalScore) as TotalScore, max(Score1) as Score1, max(Score2) as Score2, max(Score3) as Score3 by Class
| table Class, Name, Subject, TotalScore, Score1, Score2, Score3

I think my search is going to display the following:

Class   Name               Subject       TotalScore  Score1  Score2  Score3
ClassA  Name1 Name2 Name3  Math English  240         85      95      80

This is the whole data set in table format from scoreindex:

Class   Name   Subject  TotalScore  Score1  Score2  Score3
ClassA  Name1  Math     170         60      40      70
ClassA  Name1  English  195         85      60      50
ClassA  Name2  Math     175         50      60      65
ClassA  Name2  English  240         80      90      70
ClassA  Name3  Math     170         40      60      70
ClassA  Name3  English  230         55      95      80
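For reference, a sketch of one way to get both numbers in one pass: take the max of each score column per Class, then sum those maxima with eval (field names are taken from the question):

index=scoreindex
| stats max(TotalScore) AS TotalScore max(Score1) AS Score1 max(Score2) AS Score2 max(Score3) AS Score3 by Class
| eval MaxTotalScore=Score1+Score2+Score3
| table Class TotalScore Score1 Score2 Score3 MaxTotalScore

On the sample data this gives TotalScore=240 (the largest per-row total) and MaxTotalScore=85+95+80=260, matching the desired output.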
Hi all, I have to parse logs extracted from logstash. I'm receiving the logstash logs in JSON format, and almost all the fields I need are already parsed and available in the JSON. My issue is that the raw event data is in a field called "message", and the fields inside it aren't automatically extracted as I would like. I'd like to avoid re-parsing all the data sources and creating custom add-ons for each of them. Has anybody encountered this kind of integration and found a way to use the standard add-ons to parse only the message field? Thank you for your help. Ciao. Giuseppe
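For reference, one approach is to rewrite _raw to the embedded payload at ingest time on the heavy forwarder, so the standard add-on parsing then sees the original event text. A minimal sketch, assuming Splunk 8.1+ (where json_extract is available in INGEST_EVAL); the stanza and sourcetype names are placeholders:

# props.conf on the parsing tier
[logstash:json]
TRANSFORMS-unwrap_message = unwrap_logstash_message

# transforms.conf
[unwrap_logstash_message]
INGEST_EVAL = _raw:=json_extract(_raw, "message")

Each unwrapped event would still need the sourcetype its add-on expects, which can be assigned with additional transforms keyed on the logstash metadata.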
Have you consulted resources like these?

Using threat intelligence in Splunk Enterprise Security
Unified App for ES: Enrich and submit notable events - Splunk Intel Management (TruSTAR)
We use the ansible-role-for-splunk framework found on GitHub: https://github.com/schneewe/ansible-role-for-splunk It supports app deployments through the following task: https://github.com/schneewe/ansible-role-for-splunk/blob/master/roles/splunk/tasks/configure_apps.yml But this seems to require a full Search Head Cluster, and we only have a single search head node. Is the single search head setup not supported by this framework, or am I just missing something?
I will for sure! Thank you for the time you have all dedicated to this. When this is done I'll share my experience with you for any further feedback. Best regards
If the app is updating itself then it should write to the bin or local directory.  Local is preferred so the changes are not overwritten when the app is updated.
Please heed the note at the top of the file.

# DO NOT EDIT THIS FILE!
# Please make all changes to files in $SPLUNK_HOME/etc/apps/Splunk_TA_windows/local.
# To make changes, copy the section/stanza you want to change from $SPLUNK_HOME/etc/apps/Splunk_TA_windows/default
# into ../local and edit there.

Any changes made to a default file will be lost when a new version of the app is installed. All changes should be made in a local file.
Hi, I am using the same sourcetype on the same file. One copy comes in via a forwarder and the other is uploaded via the GUI. However, the fields are not extracted for the forwarded copy. This means I have to use spath to access the fields, which is a pain. Below is a file from a forwarder; we can see the fields are not extracted. Below is the same file uploaded via the GUI; in this case, the fields are extracted. This is the sourcetype:

[import_json_2]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIMESTAMP_FIELDS = start_time
TZ = Asia/Beirut
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = 1

Any ideas? Thanks in advance. Rob
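For reference, INDEXED_EXTRACTIONS is applied where the file is read, so when data arrives via a forwarder this stanza has to exist in props.conf on that forwarder; a GUI upload is parsed on the Splunk instance itself, which would explain the difference. A minimal sketch of the two options (values copied from the stanza above; which instance each file lives on is the key point):

# props.conf on the forwarder that monitors the file
[import_json_2]
INDEXED_EXTRACTIONS = json
TIMESTAMP_FIELDS = start_time
TZ = Asia/Beirut

# alternative, search-time only: props.conf on the search head
[import_json_2]
KV_MODE = json

If you take the search-time route, drop INDEXED_EXTRACTIONS everywhere so the fields are not extracted twice.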
The app is the same, and the configuration is also common. Is there any other folder where we can put the app to ensure that the config files in the app's default folder are loaded? (Without using etc/system/local)
Hi all, in my current dashboard I have several text inputs that colleagues can use to find various information. Sometimes it takes a while for their information to appear. Is there a way to add a loading notification/alert to advise colleagues that Splunk is retrieving the information but it may take some time? The delay is usually only on their first search; thereafter the searches are pretty much instant. Many thanks, Paula
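For reference, in Simple XML the search event handlers can drive a token that shows a message while a search is running. A minimal sketch (the query, token names, and panel text are placeholders):

<row>
  <panel depends="$loading$">
    <html>
      <p>Retrieving your results - the first search can take a little while...</p>
    </html>
  </panel>
  <panel>
    <table>
      <search>
        <query>index=main user=$user_tok$ | stats count by action</query>
        <progress>
          <set token="loading">true</set>
        </progress>
        <done>
          <unset token="loading"></unset>
        </done>
      </search>
    </table>
  </panel>
</row>

The progress handler sets the token as soon as the search starts dispatching, and done clears it, hiding the message panel again.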
The point of the SHC Deployer is to ensure all SHC members have the same configuration.  If there is a need for unique configurations then they will have to be done manually (or perhaps using Ansible or something similar).
Hi @ramkyreddy, what exactly is your requirement?

<your_search>
| eval sku=if(name="", substr(kit,1,5), substr(name,1,5))
| eval sku=case(sku="NAC-D","ANT-P", sku="DHV-K","ABD-U", true(),sku)

That search should work (note substr is 1-based in SPL, so the first five characters are substr(x,1,5)).

Ciao.
Giuseppe