All Posts


Could you try running it once with sudo? This should allow you to accept the license agreement, and then Splunk will use system privileges to set up the systemd service. Afterwards it should be controllable with systemctl.
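If it helps, the one-time setup could look roughly like this (a sketch assuming a default /opt/splunk install path and the default Splunkd unit name; adjust both to your environment):

```shell
# Run once as root: accepts the license and lets Splunk register its systemd unit
sudo /opt/splunk/bin/splunk start --accept-license

# Alternatively, register boot-start under systemd explicitly
sudo /opt/splunk/bin/splunk enable boot-start -systemd-managed 1

# Afterwards it can be controlled with systemctl
sudo systemctl status Splunkd
```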
Yeah, this would be a nice feature to have. There are some Splunk Ideas suggesting it: https://ideas.splunk.com/ideas/EID-I-1236 https://ideas.splunk.com/ideas/PLECID-I-424 You could add your votes to them and tell your colleagues to vote as well. Hopefully the Splunk devs will then add a search bar to this dropdown.
You can use a SEDCMD to replace all the single quotes with double quotes before indexing. In props.conf:

[yoursourcetype]
SEDCMD-singletodouble = s/'/"/g
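The effect of that SEDCMD can be sketched with its Python equivalent, using the example event from the question (sourcetype and key names are from the thread; the script itself is just an illustration):

```python
import json
import re

# Example event from the thread: JSON-like, but with single quotes
raw = "{'date': '2024-02-10', 'time': '18:59:27', 'field1': 'foo', 'field2': 'bar'}"

# Equivalent of SEDCMD-singletodouble = s/'/"/g
fixed = re.sub(r"'", '"', raw)

# After the substitution the event parses as valid JSON
event = json.loads(fixed)
print(event["field1"])  # foo
```

One caveat, which applies to the SEDCMD as well: this naive replacement breaks if a field value contains a literal apostrophe (e.g. "o'clock"), since every single quote is rewritten.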
Hi @gcusello, thanks for your reply and help. Yes, I did the following:
1. Installed Sysmon on my PC
2. Installed the Splunk forwarder on my PC
3. Configured inputs.conf by copying to
4. Already created the index=winpc index on Splunk
5. Don't see my PC/hostname in index=_internal logs for the last 30 days; I only see the Splunk hostname
6. Do I need to install the universal forwarder credentials package? I tried, but it fails when I run the command given here: https://docs.splunk.com/Documentation/Forwarder/9.1.0/Forwarder/ConfigSCUFCredentials#Install_the_forwarder_credentials_on_individual_forwarders_in_Windows
7. My Splunk universal forwarder was installed to C:\Program Files\SplunkUniversalForwarder and the service is running.
Hello. I have a data source that is "mostly" JSON formatted, except it uses single quotes instead of double quotes, so Splunk does not honor it if I set the sourcetype to json. If I run a query against it using this:

sourcetype="test" | rex field=_raw mode=sed "s/'/\"/g" | spath

it works fine, and all fields are extracted. How can I configure props and transforms to perform this change at index time, so that my users don't need the additional search parameters and all the fields are extracted by default, short of manually extracting each field? Example event, no nested fields:

{'date': '2024-02-10', 'time': '18:59:27', 'field1': 'foo', 'field2': 'bar'}
Andreas, thanks for the quick response. Unfortunately, I am using Splunk Cloud, and I see in your "curl.py" file that VERIFYSSL is "Forced to be True for Splunk Cloud Compatibility". So, while "curl -k" works from the Linux command line on my Splunk server, in Splunk SPL the "| curl verifyssl=false" is overridden in the add-on's Python code. Is there any way to override it? If not, I will have to find another way to do this, as I am constrained by my environment.
Just happened to us now... Do we know if this fixed it, and/or what the initial cause was? This was just after a splunkd restart.
Have you also configured this? https://docs.splunk.com/Documentation/SplunkCloud/latest/Security/ConfigureauthextensionsforSAMLtokens Maybe it's time for a support ticket?
Hi @Josua.Panjaitan, Did the reply above help? If so, take a quick second to click the "Accept as Solution" button on the reply that helped. If not, reply to this thread and keep the conversation going. 
Hi @Sarath Kumar.Sarepaka, If you have not yet seen this AppDynamics Documentation, please check it out and see if it helps.
Those are not many inputs, and none look particularly high-volume. Maybe you should still add another pipeline and check whether it helps. Based on the number of entries from audit.log, the volume is quite low. Can you check whether there really are so few entries at the source? If those are the entries from one Linux node over a 90-minute period, it is really lightly used.
Thank you Isoutamo. I have Classic experience.
I am collecting logs from some files in /var/log and Sysmon from journald. Event counts for the last 90 minutes:

/opt/splunkforwarder/var/log/splunk/audit.log      41
/opt/splunkforwarder/var/log/splunk/health.log     39
/opt/splunkforwarder/var/log/splunk/metrics.log  8911
/opt/splunkforwarder/var/log/splunk/splunkd.log   598
/var/log/audit/audit.log                            7
/var/log/messages                                 936
/var/log/secure                                    10
journald://sysmon                                 919

inputs.conf:

[monitor:///var/log/syslog]
disabled = 0
sourcetype = syslog
index = linux

[monitor:///var/log/messages]
disabled = 0
sourcetype = syslog
index = linux

[monitor:///var/log/secure]
disabled = 0
sourcetype = linux_secure
index = linux

[monitor:///var/log/auth.log]
disabled = 0
sourcetype = linux_secure
index = linux

[monitor:///var/log/audit/audit.log]
disabled = 0
sourcetype = linux_audit
index = linux

[journald://sysmon]
interval = 5
journalctl-quiet = true
journalctl-include-fields = PRIORITY,_SYSTEMD_UNIT,_SYSTEMD_CGROUP,_TRANSPORT,_PID,_UID,_MACHINE_ID,_GID,_COMM,_EXE
journalctl-exclude-fields = __MONOTONIC_TIMESTAMP,__SOURCE_REALTIME_TIMESTAMP
journalctl-filter = _SYSTEMD_UNIT=sysmon.service
sourcetype = sysmon:linux
index = linux

I did not change the number of pipelines; I think the default count is 1. I will find out the OS version later, as I do not have direct access to the OS. I think it is CentOS/Red Hat 8 or 9, but I may be wrong.
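For reference, adding a second ingestion pipeline on the UF (as suggested earlier in the thread) is done with the parallelIngestionPipelines setting in server.conf. A minimal sketch, assuming the default of 1 and that the host has spare CPU and I/O capacity:

```
# $SPLUNK_HOME/etc/system/local/server.conf on the forwarder
[general]
parallelIngestionPipelines = 2
```

The forwarder needs a restart for the change to take effect.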
Hi @Taruchit , let us know if we can help you more, or, please, accept one answer for the other people of Community. Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated by all the contributors
Hi @Nivedita.Kumari, I'm unsure if you are using classic dashboards or Dashboard Studio, but here is a link to all our Dashboard Documentation.
I was wondering if there is a Splunk app or a feature available to add a search bar when filtering by Splunk app. Every time, you have to scroll for a bit just looking for the correct Splunk app, even if it's just the Search app. Is there a way to add a search bar for the apps? We have one for other pages and options. I may be overlooking something.
thanks a lot @Esky73 
My presentation about Data Onboarding for the Helsinki UG: https://data-findings.com/wp-content/uploads/2024/04/Data-OnBoarding-2024-04-03.pdf It contains some hints and a workflow for how you can test data onboarding on your own workstation.
Which kinds of logs are you collecting? Is it possible that there is some log or input that stalled after it was read, and the UF is just waiting for free resources to read the next one? Do you have one or several pipelines in your UF? Any performance data from the OS level, and which OS and version are you running?
Hi @gcusello, Thank you for sharing your inputs. I have a report that fetches the last-seen timestamp of hosts across multiple indexes. I store the results in a lookup file, and then use the lookup file as a bounded, static source from which other reports and dashboards can read the results as required. It helps me in two scenarios:
1. If the report that generates the results fails for some reason, the downstream dashboards and reports that consume the data will also be impacted, and I would need to wait for the Operations team to help with the issue, or wait until the report runs again and hope it succeeds on the next execution.
2. Since I am referring to a lookup file, fetching and searching records in the SPL written for reports and dashboards is faster.
Please share any views you have on how to improve this. Thank you
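The pattern described above can be sketched in SPL (the index names and lookup filename here are made up for illustration). The generating report might look like:

```
| tstats latest(_time) AS last_seen WHERE (index=index_a OR index=index_b) BY host
| outputlookup host_last_seen.csv
```

and a consuming report or dashboard panel could then read it back cheaply, e.g. to flag hosts not seen in the last 24 hours:

```
| inputlookup host_last_seen.csv
| where last_seen < relative_time(now(), "-24h")
```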