All Posts

Hi @gcusello , Do you have sample template/script you mentioned or any reference link? That would be helpful. Thank you
Hi @jaracan , you can create a script that uses the API of the destination platform. Then you can associate this script with a Correlation Search, or schedule an alert that calls this script. Ciao. Giuseppe
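As a rough illustration of what such a script could look like, here is a minimal Python sketch of a classic scripted alert action that reads the alert's results and POSTs them to an external REST API. The endpoint, token, and payload shape are placeholders, not a real Google Chronicle integration; adapt them to the destination platform's actual API.

```python
# Hypothetical sketch of a Splunk scripted alert action that forwards
# notable-event fields to an external platform's REST API.
# API_URL and API_TOKEN are placeholders, not a real integration.
import csv
import gzip
import json
import sys
import urllib.request

API_URL = "https://example.com/api/v1/events"   # placeholder endpoint
API_TOKEN = "REPLACE_ME"                        # placeholder credential

def load_results(path):
    """Splunk passes the alert's results as a gzipped CSV file."""
    with gzip.open(path, mode="rt", newline="") as f:
        return list(csv.DictReader(f))

def forward(events):
    """POST the result rows as JSON to the destination API."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(events).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {API_TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__" and len(sys.argv) > 8:
    # For a classic scripted alert, argv[8] is the path to the
    # gzipped results file.
    forward(load_results(sys.argv[8]))
```

Dropped into $SPLUNK_HOME/bin/scripts and attached to the correlation search (or to a scheduled alert), this would run each time the search triggers.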
Hi, Good day! Just wanted to check your insights or any reference you can share. We have a clustered multisite Splunk environment and a Splunk ES SHC. Let's say a correlation search is triggered and generates notable events. Is it possible for these notable events to be sent to another platform (for example, Google Chronicle)? If yes, can you share how it would be done? So let's say a notable event is generated: it would be stored locally in Splunk and also forwarded to Google Chronicle SOAR. Is that possible?
Good day, I have a query to summarize data per week. Is there a way to display my tables in a better way, as my dates for the past month would just be the dates in number format? I would like to name the columns Week 1, Week 2, Week 3, etc. if possible.
index=db_it_network sourcetype=pan* url_domain="www.perplexity.ai" OR app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base
| eval app=if(url_domain="www.perplexity.ai", url_domain, app)
| table user, app, _time
| stats count by user app _time
| chart count by app _time span=1w
| sort app 0
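One possible approach (a sketch, not tested against this data): bin events into weeks, then build a "Week N" label with eval, numbering weeks relative to the earliest week in the search window (604800 is the number of seconds in a week).

```spl
index=db_it_network sourcetype=pan* url_domain="www.perplexity.ai" OR app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base
| eval app=if(url_domain="www.perplexity.ai", url_domain, app)
| bin _time span=1w
| stats count by user app _time
| eventstats min(_time) as start
| eval week="Week ".tostring(floor((_time - start) / 604800) + 1)
| chart sum(count) over app by week
```

If calendar week numbers are acceptable instead of a relative count, `eval week="Week ".strftime(_time, "%V")` is simpler.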
Version: 9.2.1, Build: 78803f08aabb
I'm encountering an issue in the Forwarder Management Console. When navigating to the GUI and clicking on a server class, the clients are not visible. However, if I click "Add Clients," they become visible. Additionally, under the Clients tab, while I can see the clients listed, it doesn't display how many apps are deployed on those clients. The server class indicates that 4 apps are deployed on 30 servers, but the Clients tab shows zero apps deployed on those clients. Can someone provide a proper solution for this issue?
Hi @sarvesh_11 , what do you mean by "attach"? You can read and index a csv file, not an xlsx file. Ciao. Giuseppe
Hi Splunkers, I have been working on a dashboard for which I need the data for the last 7 months, from Jan 2024 to date. When I search the logs, only the last 3 months of data show up (i.e., from Jun 10 to date), and gradually all the older logs are disappearing. Is there any way to fix this? I tried this query:
| tstats earliest(_time) as first, latest(_time) as last where index=foo
| fieldformat first=strftime(first,"%c")
| fieldformat last=strftime(last,"%c")
The result for index="my-index" shows: first = Mon Jun 10 04:19:23 2024, last = Tue Aug 27 07:50:04 2024
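Data disappearing after roughly 90 days usually points at index retention rather than search settings. One place to look (a sketch; the stanza name and value are assumptions about this environment) is indexes.conf on the indexers:

```ini
# indexes.conf (on the indexers, or pushed from the cluster manager)
[my-index]
# Data older than this many seconds is rolled to frozen, which means
# deleted unless coldToFrozenDir/coldToFrozenScript is configured.
# 7776000 s = 90 days, which would match losing everything older
# than ~3 months.
frozenTimePeriodInSecs = 7776000
```

Note that maxTotalDataSizeMB can also force older buckets out before the time limit is reached, so it is worth checking both settings.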
Hi @fahimeh , a suppression rule is a search that you can build as you need, which can also contain the time rules. Ciao. Giuseppe
Hello, I want to write a suppression in Splunk ES that suppresses an event if a specific process occurs at 11 AM every day. This limitation should be applied to the raw logs, because the ES rules execute on a specific time cycle and create notable events. My goal is to suppress the event when the rule runs, but only if the specific process exists at 11 AM. How can I apply this time constraint in the suppression? Can I do it through the search I write? How? How can I implement this time constraint on the raw data? I need to limit the time in the raw event.
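For context, an ES notable suppression is an eventtype whose name starts with notable_suppression-; its search decides which notable events are hidden. A sketch of a time-constrained one (the rule name, field names, and process value are placeholders, and date_hour is an assumption that the default datetime fields are populated on the notable's timestamp):

```ini
# eventtypes.conf (e.g. in SA-ThreatIntelligence, or created via
# Configure > Content in ES)
[notable_suppression-process_at_11am]
search = search_name="Your Correlation Search" process_name="your_process" date_hour=11
```

Because eventtype searches cannot contain pipes, the 11 AM constraint has to be expressed with an indexed/default field like date_hour rather than with strftime(); if date_hour is not available on the notables, the constraint may need to live in the correlation search itself.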
Facing the same problem, I notice the issue is a year old..
Hi Team, One of our customers reported that he was finding duplicate records in Splunk (duplicate files and duplicate data in files). We want to simulate the scenario in our lab. Could someone help write SPL to find duplicate records? Regards, Alankrit
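A common starting point (a sketch; the index and sourcetype are placeholders): hash each raw event and report hashes that occur more than once.

```spl
index=your_index sourcetype=your_sourcetype
| eval raw_hash=md5(_raw)
| stats count values(source) as sources earliest(_time) as first_seen latest(_time) as last_seen by raw_hash
| where count > 1
| sort - count
```

This finds exact duplicates of _raw; if only certain fields should match, replace md5(_raw) with a hash over those fields.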
Thank you for the suggestion. I will try this simpler way. But I wanted to avoid a possible situation where the given pattern appears somewhere else. For example, if I encounter the pattern after the sixth or seventh comma, that's not my case. I'm not sure this situation can really occur, but I don't know how to check.
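One way to check whether that situation can occur at all is to run a quick script over a sample of the raw events outside Splunk. A Python sketch (the pattern and the comma cutoff are placeholders): split each line on commas and test whether the pattern ever matches past the n-th field.

```python
import re

def pattern_after_nth_comma(line: str, pattern: str, n: int) -> bool:
    """Return True if `pattern` matches in any field after the n-th comma."""
    fields = line.split(",")
    return any(re.search(pattern, field) for field in fields[n:])
```

If this never returns True for n=6 across an exported sample, the simpler extraction should be safe.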
Hi @irkey , you have two choices: use a macro, as hinted by @KendallW , or use an eventtype containing the search parameters; for more info see https://docs.splunk.com/Documentation/Splunk/9.3.0/Knowledge/Abouteventtypes This way, if you create an eventtype called e.g. "somefield" containing somefield IN (a,b,c,d), you can call it using eventtype=somefield Ciao. Giuseppe
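As a concrete sketch of the eventtype option (stanza name and values are the example ones from above, not real fields):

```ini
# eventtypes.conf
[somefield]
search = somefield IN (a,b,c,d)

# then in any search:
#   index=your_index eventtype=somefield | ...
```

The same stanza can also be created in the UI under Settings > Event types, which avoids editing the .conf file by hand.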
Hi @wm , don't use crcSalt = <SOURCE> in your inputs.conf. Ciao. Giuseppe
That's getting more complicated. Simple forwarding of all data to multiple groups is... simple. You define two or more output groups and send everything everywhere. If you want to send all data from a specific input to given output group(s), you can use the _TCP_ROUTE setting in the input definition (see the spec for inputs.conf). But if you want to send only selected events to specific destinations, you're left with https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Routeandfilterdatad That needs a heavy forwarder (HF); a universal forwarder (UF) doesn't do parsing.
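A minimal sketch of the input-level variant (the group names, monitor path, and hosts are assumptions):

```ini
# inputs.conf on the forwarder
[monitor:///var/log/app/app.log]
_TCP_ROUTE = security_indexers

# outputs.conf on the forwarder
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997

[tcpout:security_indexers]
server = secidx1.example.com:9997
```

Here everything goes to primary_indexers by default, while that one monitored file is routed only to security_indexers. Event-level routing would instead use props.conf/transforms.conf on a heavy forwarder, per the docs link above.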
@guillermomolina There is a link at the bottom of every one of their emails - https://discover.splunk.com/subscription.html 
I signed up to Splunk or Storm and had to accept the commercial emails to finalize my sign-up. How do I unsubscribe from these emails?
@irkey Put them in a search macro - https://docs.splunk.com/Documentation/SplunkCloud/latest/Knowledge/Usesearchmacros