All Posts

The file ACF2DS_Data.csv comprises columns such as TIMESTAMP, DS_NAME, and JOBNAME. I need to perform a partial match of the LKUP_DSN column from the DSN_LKUP.csv file with the DS_NAME column in the ACF2DS_Data.csv file in order to retrieve the relevant events from ACF2DS_Data.csv.
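One common way to do a partial match like this is a wildcard lookup: define the lookup in transforms.conf with match_type = WILDCARD on the pattern column, and store the LKUP_DSN values with * wildcards (e.g. SYS1.*). A minimal sketch, assuming the lookup definition is named dsn_lkup and the CSV's LKUP_DSN values already contain wildcards (both are assumptions):

```
# transforms.conf on the search head (stanza name and settings are assumptions)
[dsn_lkup]
filename = DSN_LKUP.csv
match_type = WILDCARD(LKUP_DSN)
```

Then match the patterns against DS_NAME at search time and keep only the events that matched:

```
index=<your_index> source=ACF2DS_Data.csv
| lookup dsn_lkup LKUP_DSN AS DS_NAME OUTPUT LKUP_DSN AS matched_pattern
| where isnotnull(matched_pattern)
```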
Hi @Cheng2Ready, please see this old answer of mine: https://community.splunk.com/t5/Splunk-Search/Bank-holiday-exclusion-from-search-query/m-p/491071 Ciao. Giuseppe
Hi @Dikshi Please could you check your splunkd.log for any errors relating to Mongo/KVStore and report back. Also, have you made any changes recently to the Splunk version, permissions, certificates, or operating system? Please let me know how you get on and consider accepting this answer or adding karma to it if it has helped. Regards Will
How do I solve this? My mongod.log file is empty.
The KVStore featureCompatibility check shows: "An error occurred during the last operation ('getParameter'): domain 15, code 13053: No suitable server found: 'serverSelectionTimeoutMS' expired: [TLS handshake failed: error:00000000:lib(0):func(0):reason(0)]". This is the error that is showing.
@msatish  have you tried this?   
This is not related to a Splunkbase app but to the observability-synthetics module itself. Thanks for pointing it out; removed the associated app here!
@ganji What is the ERROR code?
@ganji  Check this https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-Add-on-Failed-to-load-Inputs-Page/m-p/631993 
@ganji
1. Check splunkd.log for the error:
index=_internal source=*splunkd.log VT4Splunk
or, on the server:
grep -i "VT4Splunk" /opt/splunk/var/log/splunk/splunkd.log
2. Uninstall and reinstall the app.
OrgID: <enter-orgid> Realm: <enter-realm> Instance Name: <instance-name> Request: Please securely open our Splunk Cloud Platform instance management port (8089) and add the IP addresses of the above realm to our allow list so that we can enable Log Observer Connect. ref: https://docs.splunk.com/observability/en/logs/scp.html#support-ticket

So when I try to connect Log Observer with Splunk Cloud (I'm on a free trial), it tells me to open the management port 8089 of Splunk Cloud, and to do that we have to raise a case with Splunk Support. Is this setup actually required?
@Fara7at08 The "Short ID" button might be missing due to changes in the interface or settings during the upgrade. According to the Upgrade Splunk Enterprise Security documentation: when you upgrade the Splunk Enterprise Security app to version 7.0.0 or higher, the short IDs for notables that were created prior to the upgrade are not displayed on the Incident Review page. As a workaround, you can recreate all the short IDs that were available prior to the upgrade.
I'm building a front-end Node.js Docker image; it needs to install the AppDynamics agent. My Node.js version is v22 (linux/amd64) and the AppDynamics agent version is 24.12.0. After the image is built and I try to start the server, I get the error below:

Appdynamics agent cannot be initialized due to Error: /appdynamics/node_modules/appdynamics-libagent-napi/appd_libagent.node: cannot open shared object file: No such file or directory
Error: /node_modules/appdynamics/node_modules/appdynamics-libagent-napi/appd_libagent.node: cannot open shared object file: No such file or directory
    at Module._extensions..node (node:internal/modules/cjs/loader:1717:18)
    at Module.load (node:internal/modules/cjs/loader:1317:32)
    at Module._load (node:internal/modules/cjs/loader:1127:12)
    at TracingChannel.traceSync (node:diagnostics_channel:315:14)
    at wrapModuleLoad (node:internal/modules/cjs/loader:217:24)
    at Module.require (node:internal/modules/cjs/loader:1339:12)
    at require (node:internal/modules/helpers:125:16)
    at Module._compile (node:internal/modules/cjs/loader:1546:14)

I checked and confirmed that appd_libagent.node is already under 
So it looks like I was able to solve this issue by mapping "${user:groups}" to the role attribute and creating Splunk SAML groups named after the Group ID from the IdP group.
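For anyone hitting the same issue, the resulting role mapping can be sketched roughly as below; the group IDs are placeholders, and the same mapping can also be created in Splunk Web under Settings > Authentication methods > SAML groups:

```
# authentication.conf (sketch; group IDs below are hypothetical)
[roleMap_SAML]
admin = <idp-group-id-for-admins>
user = <idp-group-id-for-users>
```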
There is no need to test.  Splunk will only parse an event as JSON if the *entire* event is nothing but pure well-formed JSON.  It can't parse part of the event or extract a field and parse that.  Of... See more...
There is no need to test.  Splunk will only parse an event as JSON if the *entire* event is nothing but pure well-formed JSON.  It can't parse part of the event or extract a field and parse that.  Of course, you can do those things yourself in a query, but Splunk won't do it automatically.
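If the JSON is only part of the event, one way to extract and parse it yourself in a query is rex plus spath. A minimal sketch, assuming the event contains a single {...} object and that it has a user.name key (both are assumptions about your data):

```
index=<your_index>
| rex field=_raw "(?<json_part>\{.*\})"
| spath input=json_part path=user.name output=user_name
```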
Hi everyone, I am using IAM identity Center as the IdP for SAML auth. I have 2 groups in the IdP and 2 SAML groups in Splunk with differing roles and the groups in the IdP contain different users.  ... See more...
Hi everyone, I am using IAM Identity Center as the IdP for SAML auth. I have 2 groups in the IdP and 2 SAML groups in Splunk with differing roles, and the IdP groups contain different users. My issue is that I am unable to work out how to assign a user to a group in Splunk based on the user's group in the IdP. I can hard-set the role attribute to one group name, or even both group names, but this results in all users receiving the referenced Splunk group's role regardless of which group they are assigned to in the IdP. Does anyone know how to resolve this issue, or whether there is a user attribute for group?
When I open the configuration page, VT4Splunk app throws an error. I am on Splunk 9.2.2
Hi @Cheng2Ready
To mute alerts on the day after a holiday (and on Monday when the holiday falls on a Friday), it is easier to look backwards from the alert's run time: check whether yesterday was a holiday, or, when today is Monday, whether last Friday was. Note that Date from strftime() is a string, so date arithmetic should be done on the epoch time with relative_time() rather than by adding seconds to the string. Here's a revised version of your query:

index=<search>
| eval Date=strftime(_time, "%Y-%m-%d")
| eval day_of_week=strftime(_time, "%A")
| eval check_offset=if(day_of_week=="Monday", "-3d", "-1d")
| eval CheckDate=strftime(relative_time(_time, check_offset), "%Y-%m-%d")
| lookup holidays.csv HolidayDate as CheckDate OUTPUT Holiday
| eval should_alert=if(isnull(Holiday), "Yes", "No")
| table Date should_alert
| where should_alert=="Yes"

Explanation:
Day of the week: strftime(_time, "%A") retrieves the weekday name for each event.
Look-back offset: if today is Monday we look back 3 days (to Friday); otherwise we look back 1 day. relative_time() performs the date arithmetic on the epoch timestamp.
Holiday check: the lookup matches the looked-back date against HolidayDate; if it matches, Holiday is populated and should_alert becomes "No".
Final filtering: the where clause keeps only events on days when the alert should still fire, aligning with your requirements.

This should mute alerts on the day following any holiday, including the Friday-to-Monday case. Please let me know how you get on and consider accepting this answer or adding karma to it if it has helped. Regards Will
I have a Holiday.csv file that imports dates for specific holiday dates. example: 2024-04-01 2026-12-29 2028-06-26 I am working on muting alerts during a day after the dates. So, if the holiday was on Monday, it shouldn't fire on Tuesday, if the holiday was on Tuesday, it shouldn't fire on Weds, etc. The weird one is if the holiday is on a Friday, then we actually don't want the alert to fire on Monday this is what I have for my query.  just not sure how I would add in the Friday scenario if I did   strftime(_time+86400,"%Y-%m-%d")  ```to add one day```  index=<search> | eval Date=strftime(_time,"%Y-%m-%d") | lookup holidays.csv HolidayDate as Date output Holiday | eval should_alert=if((holidays.csv!="" AND isnull(Holiday)), "Yes", "No") | table Date should_alert | where should_alert="Yes" If something like this is possible in Splunk, I think it would work: if holiday is a Friday, add 3 days, otherwise add 1 day
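The "if holiday is a Friday, add 3 days, otherwise add 1 day" idea can be expressed with relative_time() on the holiday's epoch time rather than string arithmetic. One untested sketch: build the set of muted dates from the lookup in a subsearch, then exclude events falling on those dates:

```
index=<search>
| eval Date=strftime(_time, "%Y-%m-%d")
| search NOT [
    | inputlookup holidays.csv
    | eval t=strptime(HolidayDate, "%Y-%m-%d")
    | eval Date=strftime(relative_time(t, if(strftime(t, "%A")=="Friday", "+3d", "+1d")), "%Y-%m-%d")
    | fields Date ]
```

The subsearch returns one Date per holiday (the day to mute), and the outer search drops events whose Date matches any of them.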
Hi @eandres
In Splunk, when defining a lookup within transforms.conf, the time_field parameter specifies the field in the lookup table that represents a timestamp. This lets Splunk apply time-based filtering, so that lookup results are relevant to the event's timestamp.

How to troubleshoot and fix:
- Verify the format of timestamps in your lookup file and ensure time_format matches.
- Check whether your events fall within the expected time range of the lookup.
- Test the lookup manually with | inputlookup my_lookup to confirm that timestamps are stored correctly.
- Remove time_field if time-based filtering is not required.

Please let me know how you get on and consider accepting this answer or adding karma to it if it has helped. Regards Will
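For reference, a temporal lookup definition might look like the sketch below; all names and values are placeholders, not taken from your setup:

```
# transforms.conf (sketch; stanza, file, and field names are hypothetical)
[my_lookup]
filename = my_lookup.csv
time_field = event_time
time_format = %Y-%m-%d %H:%M:%S
# optional: bound how far an event may be from the lookup timestamp
max_offset_secs = 3600
```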