All Posts


Thanks for that. (The API's challenge is that we need a Splunk application login, which has been redacted for users to reduce footprint.) show-encrypted is something I haven't tried, but it seems promising. Will test and get back.
You're on the right track in wanting to move beyond exact latitude/longitude matches, as literal value matches are ineffective due to GPS inaccuracies and sampling rates. Instead, what you need is a proximity-based approach using geospatial distance — specifically, the Haversine formula — to find the nearest segment point from your lookup file for each point in your car’s GPS trace.
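The nearest-point step described above can be prototyped outside SPL. Here is a minimal Python sketch of the Haversine distance and a nearest-segment lookup; the segment names and coordinates are made-up illustrations, not data from the thread:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat = lat2 - lat1
    dlon = lon2 - lon1
    # Haversine formula: a is the squared half-chord length between the points
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical lookup of segment points, and one GPS sample from the car's trace
segments = [("seg_A", 51.5007, -0.1246), ("seg_B", 51.5033, -0.1195)]
gps = (51.5032, -0.1196)

# Pick the segment point with the smallest great-circle distance to the sample
nearest = min(segments, key=lambda s: haversine_km(gps[0], gps[1], s[1], s[2]))
```

In SPL the same arithmetic can be written with eval's trig functions after joining each GPS event against the lookup, then keeping the row with the minimum distance.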
With current_only = 1: On first start, the UF reads only new events that arrive after the input is enabled. It skips all historical events present in the log at the time the input is first started. If the UF is stopped and restarted, it will pick up where it left off (using checkpoints), so normally it will ingest events that occurred while it was down. https://help.splunk.com/en/splunk-enterprise/get-started/get-data-in/9.4/get-windows-data/monitor-windows-event-log-data-with-splunk-enterprise Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving Karma. Thanks!
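As a minimal sketch, the setting being discussed would sit in an inputs.conf stanza along these lines (the Security channel is just an example; use whichever channels you monitor):

```ini
[WinEventLog://Security]
disabled = 0
# read only events that arrive after the input first starts;
# skip everything already in the log at that point
current_only = 1
```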
I've seen the "current_only" option but discarded it, as it will not ingest any historical data. If I set "current_only = 1" during initial deployment, it will not ingest old data - so far so good. But if the UF goes down for a period of time, after a restart it will not process the events that occurred whilst the UF was down - bad. What happens if I deploy the UF with "current_only = 1" and remove the setting after a week? Will it start ingesting all historical events? Or could I use it as a temporary setting during the onboarding phase and remove it for the production phase? Kind Regards Andre
@sawwinnaung  Try the below.

props.conf:

[linux_audit]
TRANSFORMS-set = discard_proctitle

[source::/var/log/audit/audit.log]
TRANSFORMS-set = discard_proctitle

transforms.conf:

[discard_proctitle]
REGEX = type=PROCTITLE
DEST_KEY = queue
FORMAT = nullQueue

Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving kudos/Karma. Thanks!
@Andre_  You are correct. Unlike file-based inputs, Windows Event Log inputs in the Splunk Universal Forwarder (UF) do not provide a built-in option in inputs.conf to exclude events based on their age at collection time. This means you cannot natively configure the UF to ingest only Windows events newer than 7 days during onboarding. But if you want to ingest only new Windows Event Log events (and skip all historical data), set current_only = 1 in your inputs.conf. Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving kudos/Karma. Thanks!
Hello, I am about to onboard 1000+ Windows UFs. Those hosts have Windows event logs going back many years. Is there a way to exclude any Windows event log entries older than 7 days from being ingested during the initial onboarding? For log files there's an option in inputs.conf on the UF, but is there nothing similar for event logs? Kind Regards Andre
I am trying to set up props & transforms on the indexers to send PROCTITLE events to the null queue. I tried the regex below, but it doesn't seem to work.

props.conf and transforms.conf location: /app/splunk/etc/apps/TA-linux_auditd/local/

props.conf:

[linux_audit]
TRANSFORMS-set = discard_proctitle

[source::/var/log/audit/audit.log]
TRANSFORMS-set = discard_proctitle

transforms.conf:

[discard_proctitle]
REGEX = ^type=PROCTITLE.*
DEST_KEY = queue
FORMAT = nullQueue

Sample events:

type=PROCTITLE msg=audit(1750049138.587:1710xxxx): proctitle=737368643A206165705F667470757372205B70726xxxxx
type=PROCTITLE msg=audit(1750049130.891:1710xxxx): proctitle="(systemd)"
type=PROCTITLE msg=audit(1750049102.068:377xxxx): proctitle="/usr/lib/systemd/systemd-logind"

Could someone help me fix this issue?
@dinesh001kumar
1 - Due to security restrictions, you normally can't upload or reference custom JS files directly in Splunk Cloud. You can raise a support request to have Simple XML JS extensions reviewed and uploaded for you.
2 - You can use the <html> panel in Simple XML dashboards for custom HTML, but it is limited. Normally you cannot use inline JavaScript, and some HTML elements may be restricted.

Alternative: use the built-in features of Dashboard Studio or Simple XML extensions. For specific needs, reach out to Splunk Cloud Support to request a review and upload of vetted JS/CSS files.

Audio support: Dashboard Studio does not natively support embedding or playing audio files. As a workaround, you could set up external alerting (e.g., send a webhook/email to a system that plays audio).

Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving kudos/Karma. Thanks!
Hello Splunkers, the time selector is not visible for a specific user, whereas it is visible for the admin role. Could anyone please suggest which capability needs to be added to the user's role to make the time selector visible on the dashboard? The time selector is visible for the admin user.
@Trevorator  I don't think there is any Splunk setting to enable automatic retry or continuation for .tsidx file operations. The only way to ensure all data is accelerated and .tsidx files are preserved is to maintain a healthy infrastructure and address any resource limitations. Regards, Prewin Splunk Enthusiast | Always happy to help! If this answer helped you, please consider marking it as the solution or giving kudos/Karma. Thanks!
My requirement is to create something similar in Splunk Cloud.
There is a dashboard that was created in Splunk Enterprise using only HTML code along with JavaScript and CSS files. Can you please help me clarify: 1. Does Splunk Cloud support CSS and JS files? 2. Can we write complex HTML code in a Splunk Cloud dashboard? If that is not possible, what would be the alternative solution? Also, is it possible to add an audio file in a Studio dashboard? The idea is that if there is a drop in the transaction success rate, a buzzer sound should be triggered.
I still have not resolved the issue; could anyone please help?
@parthbhawsar  We have recently configured the Cisco FMC and successfully integrated it with Splunk. Could you please share the error you are encountering in Splunk so that I can assist you further? If you continue to face issues, I would recommend reaching out to the Cisco TAC team for additional support.
@Splunkers2 Check this https://www.reddit.com/r/Splunk/comments/17msvh2/misp_integration_error/ 
If you're sending passwords via deployment apps, they generally need to be stored in plaintext on the Deployment Server. If this isn't acceptable in your environment, you'll probably end up with a work-around.

Use the REST API remotely

Are you able to access the HF remotely over port 8089 after the HF has been deployed? If so, the REST API method could still work. Instead of using 'localhost' as your hostname:

curl -k -u admin:changeme \
  https://hf.mydomain.com:8089/servicesNS/nobody/search/storage/passwords \
  -d name=my_realm:my_api_key \
  -d password=actual_api_token

Pre-encrypt the secret

Instead of copying the splunk.secret to a Docker container, encrypt the secret locally on the HF. If you have shell access, you can run something like:

read creds && $SPLUNK_HOME/bin/splunk show-encrypted --value "${creds}"

This will generate an encrypted version of $creds that is decryptable by that server's splunk.secret file. Put the string in the appropriate place in passwords.conf. Alternatively, after obtaining the encrypted string, you could insert it into the deployment app on the DS and leave it there in encrypted form (assuming, as mentioned above, you only need the credential on this one HF). Then the DS can push the entire app, encrypted password included, to the HF.
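For reference, the resulting encrypted string would land in a passwords.conf stanza roughly like the following; the realm, username, and the truncated $7$ value are illustrative placeholders, not real output:

```ini
# e.g. $SPLUNK_HOME/etc/apps/<your_app>/local/passwords.conf
[credential:my_realm:my_api_key:]
# value produced by `splunk show-encrypted`, decryptable only
# with this server's splunk.secret
password = $7$...
```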
Would we be able to automatically run the misp_alert_sighting command based on traffic matching?
Your SPL has "tick" marks around the macro drop_dm_object_name that are single quotes ('), whereas you need to use the backtick character (`): | `drop_dm_object_name("All_Traffic")`
I put a \s before and after the : because your example showed a space before the colon, but your sed was replacing a space after it. Put \s* wherever the space can occur. If you want to post examples, use the code tag option in the editor (</>) so you can see exactly what you are posting. Like this...
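To show why \s* on both sides is the safer choice, here is a small Python sketch (the sample strings are invented): one pattern handles a space before the colon, after it, on both sides, or neither.

```python
import re

# Hypothetical inputs: the delimiter may carry whitespace on either side
samples = ["key : value", "key: value", "key :value", "key:value"]

# \s* matches zero or more whitespace characters, so every variant is covered
pattern = re.compile(r"\s*:\s*")

# Normalize each sample to a single "=" delimiter with no surrounding spaces
normalized = [pattern.sub("=", s) for s in samples]
```

With \s (one mandatory whitespace) instead of \s*, the "key:value" case would not match at all.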