All Posts


Has anyone been able to successfully run the Independent Stream Forwarder on Fedora or Debian? I have inherited a small stand-alone, bare-metal Splunk Enterprise 9.1.2 instance running on Fedora 39. I'm trying to point a NetFlow stream at the ISF installed on the same server, but I'm getting blank screens in Distributed Forwarder Manager and in Configure Streams in the Splunk Stream app, which is also installed on that server. Thank you!
I fixed it! It was not the capabilities that were at fault, it was the curl command. The documentation says to use the following to create an index:

curl -k -u editor-user:MyPasword1 https://localhost:8089/servicesNS/admin/myapp/data/indexes -d name=newindex

That REST call asks to make the change in the admin namespace, but the indexes live in the nobody namespace, so I needed to change it to the following, and then it worked:

curl -k -u editor-user:MyPasword1 https://localhost:8089/servicesNS/nobody/myapp/data/indexes -d name=newindex
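As a quick sanity check after the corrected call, the new index can be read back from the same nobody namespace. This is just a sketch reusing the example credentials above, with output_mode=json added only to make the response easier to scan:

curl -k -u editor-user:MyPasword1 "https://localhost:8089/servicesNS/nobody/myapp/data/indexes/newindex?output_mode=json"

A 200 response containing the index entry confirms the create landed in a namespace the role can actually see.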
Glad you figured it out!!!
Hello @gcusello , I don't want to specify a particular index for each sourcetype, but I do want the host to send these logs to a specific index. The sourcetypes I have include WinEventLog:Security, WinEventLog:System, WinEventLog:Application and ActiveDirectory Monitoring.
Hello @ITWhisperer, thank you so much for sharing that with me. But it's not only Col7 and Col10 every time; it might be any column, and the number of backslashes might not always be the same for a given column. How would I address that issue?
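One generic approach, assuming the goal is still to strip the backslashes as in the earlier suggestion and that the columns follow a ColN naming pattern (both assumptions on my part), is to loop over the columns with foreach so that neither the column name nor the number of backslashes matters:

| foreach Col*
    [ eval <<FIELD>> = replace('<<FIELD>>', "\\\\+", "") ]

The replace call removes any run of backslashes from each Col* field; the exact backslash escaping sometimes needs adjusting depending on where the search runs.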
This is not the search log. You get the search log when you click the "Job" button and pick "Inspect Job". At the top of that screen you'll have a link to the search log.
I tried using | table data, but there is still no data when searching the index. With the spath removed and the _time field added, there are still no events in the index; the search log can be seen below. There is still data coming through the search.
Hi @BRFZ , first of all, why do you want to send logs to a different index? An index is usually chosen based on two parameters: retention and access grants. An index isn't a database table into which you put homogeneous logs; if all your logs have the same retention period and the same access grants, you could also put them in the same index. The parameter that defines a data flow is the sourcetype, not the index! Anyway, how do you take logs from Active Directory? If you have a dedicated input in inputs.conf, enable it and assign the different index. If you want to send part of the WinEventLogs to this different index, you cannot do that on the Universal Forwarder; you have to override the index values on the Indexers, but as I said, it isn't useful! Ciao. Giuseppe
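For completeness, here is a minimal sketch of that indexer-side override, assuming a hypothetical target index called ad_index and that only one sourcetype needs rerouting; this goes on the indexers (or a heavy forwarder in front of them), not on the Universal Forwarder:

props.conf:

[WinEventLog:Security]
TRANSFORMS-route_to_ad_index = route_to_ad_index

transforms.conf:

[route_to_ad_index]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = ad_index

As Giuseppe says, though, this is only worth doing if the rerouted data really needs different retention or access rights.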
Hello, I have successfully configured the Splunk Universal Forwarder on a Windows machine to send the WinEventLog System, Security, and Application logs to a specific index. Now, I need to include logs from sourcetype = 'ActiveDirectory'. Could you please guide me through the necessary steps to specify the index for the Active Directory logs in the configuration files?

inputs.conf:

[WinEventLog://Application]
disabled = 0
index = test
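For the Active Directory part, a minimal sketch of what the inputs.conf could look like, assuming you are using Splunk's built-in Active Directory monitoring input (the admon stanza) and the same test index; the stanza name and settings may differ if you rely on the Splunk Add-on for Windows defaults:

[admon://ActiveDirectory]
disabled = 0
monitorSubtree = 1
index = test

Each Windows Event Log stanza keeps its own index setting, so every input can point at whichever index you choose.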
Hi @VatsalJagani , thanks for your input, and sorry for my late response. We actually finally figured it out, and frankly I had never seen this before, but I think/hope it makes sense. The issue was reuse of events from the same log file: several events were copied and pasted into several test log source files with different names, but the same event was pasted into more than one file. Of course this causes duplicate events, but since they came from different source files I expected it to work while building and testing a new parser (props and transforms), but no. After we made sure the same lines from the core log file were only used once in a new test log file, all was good again. Lessons learned.
What rules have you got configured in $SPLUNK_HOME/etc/apps/SA-ITOA/default/itsi_rules_engine.properties?
I am creating a script that uses the CLI to create/delete Splunk roles. So far, I have been successful with creating them in the script when I use the admin user. However, my CISO says that I can't use the Splunk admin user and that I need to create a Splunk user (and a Splunk role) that can create and delete indexes. I have tried adding the indexes_edit capability, and when I tried doing the delete as my user, Splunk said that I needed to have the list_inputs capability. I have also tried adding access to all indexes. I am using this document for guidance at the moment, but it is rather light on detail: https://docs.splunk.com/Documentation/Splunk/latest/Security/Rolesandcapabilities

The command that I am running is:

curl -k -u editor-user:MyPasword1 https://localhost:8089/servicesNS/admin/myapp/data/indexes -d name=newindex

I get the following:

<response>
  <messages>
    <msg type="ERROR">Action forbidden.</msg>
  </messages>
</response>

This command succeeds if I use the admin user, but not with my editor user. The capabilities I currently have on my existing editor role are:

[role_editor]
admin_all_objects = disabled
edit_roles = enabled
indexes_edit = enabled
list_inputs = enabled
srchIndexesAllowed = *
srchMaxTime = 8640000
srchTimeEarliest = -1
srchTimeWin = -1

Does anyone know what extra capabilities I need, please?
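For the delete side of the script, the same data/indexes endpoint accepts an HTTP DELETE against the index name. A sketch, assuming the same editor-user credentials and, as it turned out later in this thread, the nobody namespace rather than admin:

curl -k -u editor-user:MyPasword1 -X DELETE https://localhost:8089/servicesNS/nobody/myapp/data/indexes/newindex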
If the alerts are defined in a custom app (a best practice), then edit savedsearches.conf to put the alerts in another app, then upload and install both apps. Otherwise, you can use the REST API to do the job. See https://docs.splunk.com/Documentation/Splunk/9.2.2/RESTREF/RESTsearch#saved.2Fsearches.2F.7Bname.7D First, I'd try modifying the eai:acl.app setting, but I'm not sure that's supported. If it works, you're golden and just need to loop through a list of searches to move. If that doesn't work, then you're stuck with copy-and-delete: read the specs for each search, create a copy of it in the destination app, then delete the original.
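A rough sketch of that copy-and-delete path with curl, using hypothetical names throughout (an alert called my_alert, apps src_app and dst_app, admin:changeme credentials); in practice you would copy over the alert actions, schedule and permissions as well, not just the search string shown here:

curl -k -u admin:changeme "https://localhost:8089/servicesNS/nobody/src_app/saved/searches/my_alert?output_mode=json"

curl -k -u admin:changeme https://localhost:8089/servicesNS/nobody/dst_app/saved/searches \
  -d name=my_alert --data-urlencode "search=index=main | stats count"

curl -k -u admin:changeme -X DELETE https://localhost:8089/servicesNS/nobody/src_app/saved/searches/my_alert

The first call reads the existing definition, the second recreates it in the destination app, and the third removes the original.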
I also tried with just a 7-day backfill, but sadly got the same results. 7 days is also the minimum backfill option Splunk ITSI offers. When you say backfill for 1 day, do you mean manually filling the itsi_summary index, i.e. not going via the Splunk ITSI GUI?
Another possibility is to try | table data instead of fields
So, with the spath removed and the _time field added, do your events now show up in the index (after the collect command)? If not, what does the search log say for the search with the collect command?
Because I don't see the events when searching the index I created, I can't tell the timestamp. I added | eval _time=now() to the query so that it would use the current time when the endpoint was reached.
So there is a field in the log named data, and that's where I need my log from. I also removed the spath command from the query. Removing the collect command still shows the log curled from the endpoint.
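For reference, a minimal sketch of the kind of pipeline being discussed, with hypothetical index and sourcetype names; the point to watch is that collect writes the events with whatever _time they carry, so if _time falls outside the time range you search on the destination index, the events exist but won't show up unless you search over All time:

index=web sourcetype=curl_endpoint
| eval _time=now()
| table _time data
| collect index=test_summary

If events still don't appear, the search.log of the collecting search (via Inspect Job) is usually the place that says why.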
   
It works like a charm!! Thank you!! EDIT: No, sorry, my bad. I realized I forgot to mention that timestamps are not always at :00. I've updated the question. Would you have any suggestion on how to adapt your solution to take the minutes into account? Thank you.
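Without seeing the updated question, a common way to handle timestamps that don't fall exactly on :00 is either to floor them with bin or to round them with eval; both lines below are sketches assuming hour granularity is wanted, and the span or divisor can be changed for minutes:

| bin _time span=1h
| eval _time = round(_time/3600)*3600

bin snaps each timestamp down to the start of its hour, while the eval version rounds to the nearest hour.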