All Topics

Hi Experts, I am encountering an issue with using filter tokens in a specific row on my dashboard. I have two filters named ABC and DEF; the token for ABC is $abc$ and for DEF it is $def$. I want to pass these tokens only to one specific row, while for the others I want to reject them. For the rows where I need to pass the tokens, I've used the following syntax: <row depends="$abc$ $def$"></row>. For the rows where I don't want to use the tokens, I've used the following syntax: <row rejects="$abc$ $def$"></row>. However, when I use the rejects condition, those rows are hidden. I want these rows to still be visible. Could someone please advise on how to resolve this issue? I would appreciate any help. Thank you in advance!

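A hedged Simple XML sketch of the layout being described, assuming the intent is one row driven by the tokens plus rows that should stay visible regardless of token state (panel titles and searches are placeholders). Since rejects is meant to hide an element whenever any listed token is set, a row that should always remain visible would normally omit both attributes rather than use rejects:

  <dashboard>
    <row depends="$abc$,$def$">
      <panel>
        <title>Visible only when both tokens are set</title>
        <table>
          <search>
            <query>index=_internal $abc$ $def$ | head 10</query>
          </search>
        </table>
      </panel>
    </row>
    <row>
      <panel>
        <title>Always visible; the search simply does not reference the tokens</title>
        <table>
          <search>
            <query>index=_internal | head 10</query>
          </search>
        </table>
      </panel>
    </row>
  </dashboard>
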
I have a relatively simple query that counts HTTP 404 events in IIS logs. I wanted to sort them according to which hosts had the highest individual count, however the "highcount" field is always blank. (I probably need to also sort by host, but that's irrelevant to the eventstats issue.)

  index=iis status=404 uri="*/*.*"
  | stats count by host uri
  | eventstats max(count) by host as highcount
  | sort -highcount -count
  | table highcount count host uri

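A hedged rewrite for comparison, keeping the index and field names from the post: in eventstats the alias normally goes before the by clause (max(count) as highcount by host); otherwise no field called highcount is created:

  index=iis status=404 uri="*/*.*"
  | stats count by host uri
  | eventstats max(count) as highcount by host
  | sort -highcount -count
  | table highcount count host uri
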
I'm wondering about a text input box that also has a dropdown, i.e. a single input where you can either pick from a dropdown or enter free text, driving a single token. I am not sure whether this will work and would appreciate some guidance. Thanks in advance. Sanjai S

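A hedged Simple XML sketch of one way this is commonly approximated, assuming a dropdown that either sets the final token directly or reveals a free-text input feeding the same token (all token and choice names below are made up for illustration):

  <input type="dropdown" token="dd_choice" searchWhenChanged="true">
    <label>Select a value or choose Custom</label>
    <choice value="valueA">Value A</choice>
    <choice value="valueB">Value B</choice>
    <choice value="_custom_">Custom...</choice>
    <change>
      <condition value="_custom_">
        <set token="show_text">true</set>
        <unset token="combined"></unset>
      </condition>
      <condition>
        <unset token="show_text"></unset>
        <set token="combined">$value$</set>
      </condition>
    </change>
  </input>
  <input type="text" token="combined" depends="$show_text$">
    <label>Enter a custom value</label>
  </input>

Panels would then reference $combined$ regardless of which path set it.
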
Dear Splunkers, I would like to ask for your feedback on the following issue with the ServiceNow add-on app. The problem is that I'm not able to display the settings page for the add-on where I need to select the different ServiceNow accounts that were configured successfully. What it should look like is the following (screenshot taken from a different project): this is where I can choose my preferred account and configure details. Can you suggest what could be the reason I do not see these settings with my admin user role account? Thanks in advance. BR

Hello, I have started a Cloud Trial to create a test environment for a connector that I wanted to test for a customer. This connector requires additional ports to be opened to allow data ingestion from Azure Event Hub, which should be configured using the ACS API. I've enabled token authentication from the portal and generated a new token. I then tried to configure Postman to use the API and set up a new request to test API access:

  https://admin.splunk.com/{{stack}}/adminconfig/v2/status

where {{stack}} represents my instance name defined at the collection level, with the bearer token configured in the Authorization tab. However, when executing the request, it loops for approximately 30 seconds to a minute before resulting in the following error message:

  {
    "code": "500-internal-server-error",
    "message": "An error occurred while processing this request. Trying this request again may succeed if the bug is transient, otherwise please report this issue this response. (requestID=426a14b3-97e3-968a-a924-f3abc4300795). Please refer to https://docs.splunk.com/Documentation/SplunkCloud/latest/Config/ACSerrormessages for general troubleshooting tips."
  }

Despite my efforts, this error has persisted for over 24 hours, and I have no idea what the root cause might be. Could anyone advise on how to address this issue and successfully configure the necessary settings? Any assistance would be greatly appreciated. Thank you.

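For comparison, a hedged equivalent of the same request outside Postman, with the stack name and token shown as placeholders rather than real values:

  curl -sS "https://admin.splunk.com/<stack-name>/adminconfig/v2/status" \
       -H "Authorization: Bearer <your-token>"

If a bare call like this returns the same 500, that would point away from the Postman setup and toward the stack or token side.
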
Hi all, I would like to visualize a person's schedule as well as show the moments when events took place. The visualization should make it apparent whether the events took place within or outside the person's working hours. I'm stumped at how to tackle this. Does anyone know which visualization type to use? Perhaps also any pointers on how to prepare the data?

Data example:

  EmployeeID    work_from    work_until    event_timestamp
  123           08:00        17:00         16:30
  123           08:00        17:00         01:00

Below is a quick sketch of what I would like to end up with. The green bars show the working hours, so on Monday this person is working from 14:00 - 24:00 and has an event at 23:00. On Tuesday the person is not working but still has 3 events.

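One hedged way to prepare the data, assuming the fields arrive exactly as named in the example, that work_from/work_until are zero-padded HH:MM strings, and that each event has a normal _time (base search and field names are assumptions):

  ... base search ...
  | eval event_hhmm=strftime(_time, "%H:%M")
  | eval within_hours=if(event_hhmm>=work_from AND event_hhmm<=work_until, "inside", "outside")
  | table EmployeeID work_from work_until event_hhmm within_hours

Zero-padded HH:MM strings compare correctly as plain strings, which is what the if() above relies on; something like this could then feed a scatter or timeline-style visualization.
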
As an MSSP overseeing 10 distinct Splunk customers, I'm on the lookout for a solution that allows me to efficiently monitor license usage and memory consumption across all clients. Since these customers don't communicate with each other, I'm considering setting up a dedicated Splunk instance to collect and consolidate these logs. Any recommendations for apps that can help achieve this seamlessly, or perhaps an alternative approach that you've found effective in similar scenarios? Your insights would be greatly appreciated! Thanks in advance. #SplunkMonitoring #MSSPChallenges #splunkenterprise #monitoringConsole
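One hedged starting point for such a consolidating instance, assuming each customer's _internal license usage data is forwarded there and is distinguishable (for example by index or a per-customer field; the grouping below is only illustrative):

  index=_internal source=*license_usage.log* type=Usage
  | eval GB=round(b/1024/1024/1024, 3)
  | timechart span=1d sum(GB) as ingested_GB by idx
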
Hello everyone. I experienced a cyberattack on my computer, and the Avast Firewall detected it and alerted me via pop-up messages. I intend to report the incident to the police for investigation, and they require me to have a log file containing details of this attack. The hard drive's Windows boot trail has been corrupted, leaving me only able to access the folders and files as an external drive. I need assistance in locating the correct folder and file containing the evidence of the cyberattack. I possess a screenshot displaying the Avast warning message, which occurred on January 4, 2024. Thank you, Daniel

I have the index=fortigate and there are two sourcetypes ("fgt_event" and "fgt_traffic").

  index=fortigate sourcetype=fgt_event | stats count by user, assignip

  user    assignip
  john    192.168.1.1
  paul    192.168.1.2

  index=fortigate sourcetype=fgt_traffic | stats count by src srcport dest destport

  src            srcport    dest        destport
  192.168.1.1    1234       10.0.0.1    22
  192.168.1.2    4321       10.0.0.2    22

I want to correlate the results like this:

  user    src (or assignip)    srcport    dest        destport
  john    192.168.1.1          1234       10.0.0.1    22
  paul    192.168.1.2          4321       10.0.0.2    22

I have learned SPL such as join, mvappend, coalesce, subsearch, etc. I have tried a lot of combinations of these SPL functions, but it still isn't working. Please help me. Thanks.

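A hedged sketch of one common pattern, searching both sourcetypes together and grouping on a coalesced IP key; it assumes assignip and src really do hold the same address for the same session:

  index=fortigate sourcetype=fgt_event OR sourcetype=fgt_traffic
  | eval ip=coalesce(assignip, src)
  | stats values(user) as user values(srcport) as srcport values(dest) as dest values(destport) as destport by ip
  | where isnotnull(user)
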
Hello, I'm trying to pass a list of dicts from a "custom code block" into a "filter block", to run either the ip_lookup or hash_lookup sub-playbook (or both) based on the indicator type. For example:

  ioc_list = [
      {"ioc": "2.2.2.2", "type": "ip"},
      {"ioc": "1.1.1.1", "type": "ip"},
      {"ioc": "ce5761c89434367598b34f32493b", "type": "hash"}
  ]

And then in the filter I have:

  if get_indicators:custom_function:ioc_list.*.type == ip    -> run ip_lookup sub-playbook
  if get_indicators:custom_function:ioc_list.*.type == hash  -> run hash_lookup sub-playbook

It looks like the filter does half of the job, because it can route to the proper sub-playbook(s), but instead of forwarding only the elements that match the conditions, it simply forwards all elements.

Expected output, filtered-data on the condition_1 route:

  [{"ioc": "2.2.2.2", "type": "ip"}, {"ioc": "1.1.1.1", "type": "ip"}]

Expected output, filtered-data on the condition_2 route:

  [{"ioc": "ce5761c89434367598b34f32493b", "type": "hash"}]

Actual output on both condition routes:

  [{"ioc": "2.2.2.2", "type": "ip"}, {"ioc": "1.1.1.1", "type": "ip"}, {"ioc": "ce5761c89434367598b34f32493b", "type": "hash"}]

Even though this seems like a specific question, it is also part of a broader misunderstanding of how custom code blocks and filter blocks interact with each other. I hope someone can point me in the right direction. Thanks.

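A hedged, generic-Python sketch of one workaround: split the list inside the custom code block itself so each downstream path only ever receives matching items. The function name and how its outputs get wired to the sub-playbooks are assumptions, not SOAR-specific APIs:

  def split_iocs(ioc_list):
      """Split a list of IOC dicts into IP and hash sublists based on their 'type' key."""
      ips = [item for item in ioc_list if item.get("type") == "ip"]
      hashes = [item for item in ioc_list if item.get("type") == "hash"]
      return ips, hashes

  # Example: 'ips' would feed the ip_lookup path, 'hashes' the hash_lookup path.
  ips, hashes = split_iocs(ioc_list)
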
Splunk SOAR (On-premises) installs with a default license, the Community License. The Community License is limited to:
- 100 licensed actions per day
- 1 tenant
- 5 cases in the New or Open states
Splunk SOAR (On-premises) installs with a default license, the Community License. The Community License is limited to: 100 licensed actions per day 1 tenant 5 cases in the New or Open states If the quota of 5 cases is already reached, will I still be assigned new cases? Alternatively, can new cases only be assigned once the previous 5 cases have been resolved?

Hi all, I'm struggling with a problem where I can't find any error logs for the Asset and Identity Management dashboard in Splunk Enterprise Security. It shows NOT FOUND, and the error message behind it is "You need edit_modinput_manager capability to edit information." But I'm an admin who should already have this permission. I hope someone can tell me how to fix this error. Thank you.

We had a problem with our syslog server and a bunch of data went missing from the ingest. The problem was actually caused by the UF not being able to keep up with the volume of logs before the logrotate process compressed the files, making them unreadable. I caught this in progress and began making copies of the local files so that they would not get rotated off the disk. I am looking for a way to put them back into the index in the correct place in _time. I thought it would be easy, but it is turning out harder than I expected.

I have tried making a monitor input for a local file and cat/printf-ing the log file into the monitored file. I have also tried the "add oneshot" CLI command; neither way has gotten me what I want. The monitored file kind of works, and I think I could probably make it better given some tweaking. The "add oneshot" command actually works very well, and it is the first time I am learning about this useful command. My problem, I believe, is that the sourcetype I am using is not working as intended. I can get data into the index using the oneshot command and it looks good, as far as breaking the lines into events, etc. The problem I am seeing is that the parsing rules included with the props/transforms in the Splunk_TA_paloalto add-on are not being applied. Splunk is parsing some fields, but I suspect it is guessing based on the format of the data. When I look at the props.conf for the TA, I see it uses a general stanza called [pan_log], but inside the config it will transform the sourcetype into a variety of different sourcetypes based on the type of log in the file (there are at least 6 possibilities):

  TRANSFORMS-sourcetype = pan_threat, pan_traffic, pan_system, pan_config, pan_hipmatch, pan_correlation, pan_userid, pan_globalprotect, pan_decryption

When I use the oneshot command, the data goes into the index and I can find it by specifying the source, but none of these transforms is happening, so the logs are not separated into the final sourcetypes. Has anybody run into a problem like this and know a way to make it work? Or have any other tips that I can try to make some progress on this? One thing I was thinking is that the Splunk_TA_paloalto add-on is located on the indexers, but not on the server that I am running the oneshot command from. I expected this would all be happening on the indexer tier, but maybe I need to add it locally so Splunk knows how to handle the data. Any ideas?

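For reference, a hedged sketch of the oneshot invocation being described, with a placeholder path and index name; pan_log is the catch-all sourcetype that the TA's TRANSFORMS stanza keys on:

  splunk add oneshot /tmp/recovered/pan_2024-01-10.log -sourcetype pan_log -index <your_pan_index>

One caveat, stated as an assumption rather than a diagnosis: index-time TRANSFORMS run wherever the data is parsed, so if the host running the command is a full or heavy instance it parses the data locally and would need the Splunk_TA_paloalto props/transforms installed there, whereas a universal forwarder ships the data unparsed to the indexers.
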
Why can I not find any documentation on Firewall Investigator module of Splunk Enterprise?

Hello all, I wanted to share my recently published Splunk extension for Chrome and Edge. This extension is free and enables several features:
- Code commenting with Ctrl + /
- Comment collapsing/folding
- Saving and retrieving queries
- Inline command help
You can check it out at Splunk Search Assistant (google.com). Julio

Is there a way to change the _time field of imported data to be a custom extracted datetime field? Or at least some way to specify a different field to be used by the time picker? I have seen some solutions that use props.conf, but I am on Splunk Cloud.

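A hedged search-time sketch, assuming a custom field called event_date in a known format (both the field name and the format string are placeholders, and this changes _time within the search results rather than what was written at index time):

  index=my_index
  | eval _time=strptime(event_date, "%Y-%m-%d %H:%M:%S")
  | table _time event_date
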
NetApp products which are running Data ONTAP are being transitioned from ZAPI to REST. Support for ZAPI will be dropped in future ONTAP releases. Since the Splunk TA uses ZAPI, does it also support REST? If it does not currently use REST, are there plans to deliver a future version which does? Thanks.

  | tstats count where index=app-idx host="*abfd*" sourcetype=app-source-logs by host

This is my alert query. I want to modify it so that I won't receive alerts at certain times. For example: every month on the 10th, 18th, and 25th, during 8am to 11am, I don't want to get the alerts. On all other days and times it should work as normal. How can I do it?

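A hedged sketch of one way to suppress results inside the search itself, assuming the quiet window means 08:00-11:00 on the 10th, 18th, and 25th of each month (adjust the logic if the day and time conditions are meant independently):

  | tstats count where index=app-idx host="*abfd*" sourcetype=app-source-logs by host
  | eval day=tonumber(strftime(now(), "%d")), hour=tonumber(strftime(now(), "%H"))
  | where NOT ((day=10 OR day=18 OR day=25) AND hour>=8 AND hour<11)
  | fields - day hour
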
When writing regex, where in the regex string am I supposed to add the (?<new_field>) part? I have included a sample regex string below; where in this string would I add (?<new_field>)?

  (?<=\:\[)(.*)(?=\])

Thanks!

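A hedged illustration of the placement using the sample pattern from the post: the ?<new_field> part goes inside the capturing group, immediately after its opening parenthesis, so the middle group (the text between the brackets) becomes the named field. Shown here wrapped in a rex command purely as an example:

  | rex field=_raw "(?<=\:\[)(?<new_field>.*)(?=\])"
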
Hi All! Hope all is well. I am about to pull my hair out trying to override a sourcetype for a specific set of TCP network events. The event starts with the same string of 'acl_policy_name' and it is currently being labeled with a sourcetype of 'f5:bigip:syslog'. I want to override that sourcetype with a new one labeled 'f5:bigip:afm:syslog'; however, even after modifying the props and transforms conf files, still no dice. I used regex101 to ensure that the regex for the 'acl_policy_name' match is correct, but I've gone through enough articles and Splunk documentation to no avail. Nothing in the btool outputs looks out of place or as though it could be interfering with the settings below. Any thoughts or suggestions would be greatly appreciated before I throw my laptop off a cliff. Thanks in advance!

Event Snippet:

Inputs.conf
  [tcp://9515]
  disabled = false
  connection_host = ip
  sourcetype = f5:bigip:syslog
  index = f5_cs_p_p

Props.conf
  [f5:bigip:syslog]
  TRANSFORMS-afm_sourcetype = afm-sourcetype

*Note: I also tried [source::tcp:9515] as a spec instead of the sourcetype, but no dice either way.

Transforms.conf
  [afm-sourcetype]
  REGEX = ^acl_policy_name="$
  DEST_KEY = MetaData:Sourcetype
  FORMAT = sourcetype::f5:bigip:afm:syslog
  WRITE_META = true

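For comparison, a hedged variant of the transform that differs only in the regex: the trailing $ in the original requires the event to end immediately right after the opening quote, so a prefix-only match may be worth trying if the events continue past acl_policy_name=" (an assumption, since the event snippet isn't visible here):

  [afm-sourcetype]
  # Same stanza as above with the end-of-line anchor removed; everything else unchanged.
  REGEX = ^acl_policy_name="
  DEST_KEY = MetaData:Sourcetype
  FORMAT = sourcetype::f5:bigip:afm:syslog
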