Hi @RezaET, the issue is probably the default search path: by default, only the main index is in the default search path, and the searches in your app don't specify an index. You have two options: add the other indexes to the default search path, or add the index explicitly to all of the searches in your app. Ciao. Giuseppe
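As a hedged sketch of the first option (the role and index names below are assumptions for your environment), the default search path is controlled per role in authorize.conf on the search head:

```
# authorize.conf -- sketch, assuming the role is "user"
# and the app's data lives in "my_app_index"
[role_user]
srchIndexesDefault = main;my_app_index
```

The second option needs no configuration: start each search in the app with an explicit `index=my_app_index` term.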
Hi Splunkers, I'm working on a React app for Splunk, based on the documentation provided: https://splunkui.splunk.com/Packages/create/CreatingSplunkApps I need to hide (or at least configure) the Splunk header bar while the React app is in use. I've come across several community posts suggesting that overriding the default view files might be necessary, but I'm unsure how to configure this within a React app. I'd appreciate any guidance on how to achieve this.
index=db_it_network sourcetype=pan* url_domain="www.perplexity.ai" OR app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base
| eval app=if(url_domain="www.perplexity.ai", url_domain, app)
| table user, app, _time
| eval week_num = "Week " . strftime(_time, "%U")
| stats count by user app week_num
| chart count by app week_num
| sort 0 app
According to Windows Export Certificate - Splunk Security Content, the detection uses a macro in the first line of the query:

`certificateservices_lifecycle` EventCode=1007
| xmlkv UserData_Xml
| stats count min(_time) as firstTime max(_time) as lastTime by Computer, SubjectName, UserData_Xml
| rename Computer as dest
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `windows_export_certificate_filter`

The `certificateservices_lifecycle` macro expands to:

(source=XmlWinEventLog:Microsoft-Windows-CertificateServicesClient-Lifecycle-System/Operational OR source=XmlWinEventLog:Microsoft-Windows-CertificateServicesClient-Lifecycle-User/Operational)

I know that in SPL a search that starts with only a source filter returns nothing, so I added index=* before `certificateservices_lifecycle`, but unfortunately I still get no results. I then used metasearch to check whether the data is available at all:

First query: | metasearch index=* source IN ("XmlWinEventLog:Microsoft-Windows-CertificateServicesClient-Lifecycle-System/Operational")
Second query: | metasearch index=* source IN ("XmlWinEventLog:Microsoft-Windows-CertificateServicesClient-Lifecycle-User/Operational")

Both return 0 results. The question is: if I want to get data from source="XmlWinEventLog:Microsoft-Windows-CertificateServicesClient-Lifecycle-User/Operational", do I need to configure something on the endpoints, or can it be solved within Splunk? Thanks
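For what it's worth, when metasearch over index=* finds nothing, the event channel is most likely not being collected at all, which is an endpoint-side input configuration. A minimal sketch of the Windows input stanza (the index name is an assumption; renderXml is what produces the XmlWinEventLog source):

```
# inputs.conf on the Windows endpoints (e.g. pushed via a deployment app)
[WinEventLog://Microsoft-Windows-CertificateServicesClient-Lifecycle-User/Operational]
disabled = 0
renderXml = true
index = wineventlog
```

The same stanza form applies to the -System/Operational channel. Note the channel must also be enabled in Windows Event Viewer, since these Operational logs are not always on by default.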
Hi @jaracan, I can't provide it because it depends on the target platform, which I don't know: you have to create a script that calls the platform's API, passing it the correlation search data or the results of a search on the notable index. Check whether there's an app that already provides this export; for some platforms (e.g. Microsoft Defender) one has already been developed. Ciao. Giuseppe
Hi @jaracan, you can create a script that uses the API of the destination platform. Then you can associate this script with a Correlation Search, or schedule an alert that calls this script. Ciao. Giuseppe
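To illustrate the shape of such a script, here is a hedged Python sketch of an alert action that forwards one notable event to an external platform over HTTPS. The endpoint URL, token, and field names are all assumptions; the real target schema comes from the destination platform's API documentation:

```python
#!/usr/bin/env python3
"""Sketch of a Splunk alert-action script that forwards a notable
event to an external platform. URL, token, and payload schema are
assumptions -- adapt them to the target API."""
import json
import urllib.request

SOAR_URL = "https://soar.example.com/api/v1/events"  # hypothetical endpoint
API_TOKEN = "REPLACE_ME"                             # hypothetical token


def build_payload(notable):
    """Map a notable event (a dict of result fields) to the target schema."""
    return {
        "title": notable.get("rule_title", "Splunk notable"),
        "severity": notable.get("urgency", "unknown"),
        "source": "splunk_es",
        "raw": notable,
    }


def forward(notable):
    """POST one notable to the destination platform; returns HTTP status."""
    data = json.dumps(build_payload(notable)).encode("utf-8")
    req = urllib.request.Request(
        SOAR_URL,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
    )
    with urllib.request.urlopen(req) as resp:  # network call; not exercised here
        return resp.status


if __name__ == "__main__":
    # Fields here mimic typical notable-index fields; verify against your data.
    sample = {"rule_title": "Excessive Failed Logins", "urgency": "high"}
    print(json.dumps(build_payload(sample), indent=2))
```

When wired up as a Splunk custom alert action, the script would read the search results from the payload Splunk passes it on stdin instead of the hard-coded sample above.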
Hi, Good day! Just wanted to check your insights or any reference that you can share. We have a clustered multisite Splunk environment and a Splunk ES SHC. When a correlation search is triggered, it generates notable events. Is it possible for these notable events to be sent to another platform (for example, Google Chronicle)? If yes, can you share how it can be done? So when a notable event is generated, it would be stored locally in Splunk and also forwarded to Google Chronicle SOAR. Is that possible?
Good day, I have a query that summarizes data per week. Is there a way to display my table in a better way? The dates for the past month currently just appear in numeric date format. I would like to name the columns Week 1, Week 2, Week 3, etc. if possible.

index=db_it_network sourcetype=pan* url_domain="www.perplexity.ai" OR app=claude-base OR app=google-gemini* OR app=openai* OR app=bing-ai-base
| eval app=if(url_domain="www.perplexity.ai", url_domain, app)
| table user, app, _time
| stats count by user app _time
| chart count by app _time span=1w
| sort 0 app
I'm encountering an issue in the Forwarder Management Console. When navigating to the GUI and clicking on a server class, the clients are not visible. However, if I click "Add Clients," they become visible. Additionally, under the Clients tab, while I can see the clients listed, it doesn't display how many apps are deployed on those clients. The server class indicates that 4 apps are deployed on 30 servers, but the Clients tab shows zero apps deployed on those clients. Can someone provide a proper solution for this issue?
Hi Splunkers, I have been working on a dashboard for which I need the last 7 months of data, from January 2024 to date. When I search the logs, only the last 3 months of data is shown (from 10 June to date), and older logs are gradually disappearing. Is there any way to fix this? I tried this query:

| tstats earliest(_time) as first, latest(_time) as last where index=foo
| fieldformat first=strftime(first,"%c")
| fieldformat last=strftime(last,"%c")

The result shows:

index="my-index"  first: Mon Jun 10 04:19:23 2024  last: Tue Aug 27 07:50:04 2024
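If the oldest searchable event keeps moving forward over time, the usual suspect is index retention (frozenTimePeriodInSecs or maxTotalDataSizeMB rolling old buckets to frozen). As a hedged sketch (the index name is an assumption), you can check the oldest bucket still on disk with dbinspect and compare it against the tstats result:

```
| dbinspect index=foo
| stats min(startEpoch) as oldest max(endEpoch) as newest
| fieldformat oldest=strftime(oldest,"%c")
| fieldformat newest=strftime(newest,"%c")
```

If `oldest` here tracks the `first` value from tstats, the data has genuinely aged out of the index rather than become unsearchable.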
Hello, I want to write a suppression in Splunk ES that suppresses an event if a specific process occurs at 11 AM every day. The limitation should be applied to the raw logs, because ES rules execute on a schedule and create notable events. My goal is to suppress the event when the rule runs, but only if the specific process exists at 11 AM. How can I apply this time constraint in the suppression? Can I do this through the search I write, and how? How can I implement this time constraint on the raw data? I need to limit the time in the raw event.
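As a hedged sketch of the hour constraint itself (index and field names are assumptions), a search can be restricted to events whose own timestamp falls in the 11 AM hour either with the default date_hour field or with an eval on _time:

```
index=your_index process_name="your_process" date_hour=11
```

or, since date_hour is derived from the raw timestamp and is not present for all data (e.g. some structured or timezone-adjusted sources), the safer variant:

```
index=your_index process_name="your_process"
| where strftime(_time, "%H") == "11"
```

Either form can serve as the body of a suppression search, so that only events stamped in the 11:00-11:59 hour match, regardless of when the ES rule's scheduled cycle runs.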
Hi Team, one of our customers reported finding duplicate records in Splunk (duplicate files and duplicate data within files). We want to simulate the scenario in our lab. Could someone help write SPL to find duplicate records? Regards, Alankrit
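A hedged sketch of one way to surface duplicates (the index name is an assumption, and grouping by _raw can be heavy on large datasets): group events by their raw text and source, then keep only groups that occur more than once:

```
index=your_index
| stats count earliest(_time) as first latest(_time) as last by source, _raw
| where count > 1
| fieldformat first=strftime(first,"%c")
| fieldformat last=strftime(last,"%c")
| sort - count
```

Each remaining row is one duplicated event, with how many copies exist and the time span over which they were indexed, which helps distinguish a re-read file (copies far apart) from a double-configured input (copies near-simultaneous).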