@richgalloway We have a licensed Splunk standalone server. Our customer wants us to configure Active Directory to authenticate all the Splunk users.
@gcusello I will surely try this solution.
When searching metrics.log for the indexers in Splunk Cloud, I'm seeing the following: group=pipeline, name=typing, processor=regexreplacement, cpu_seconds=0.002, executes=838, cumulative_hits=10378371, in=113716, out.splunk=111870, out.drop=1846. What is out.drop telling me here? Am I losing data, and what do I need to configure to avoid losing data?
What do you want to do with LDAP in Splunk?  Typical uses are to authenticate users and to query AD (for users, groups, etc.). Splunk's LDAP functionality is for authenticating Splunk users.  The LDAP add-on allows for querying AD as part of a Splunk search. Yes, a standalone Splunk server on a Windows VM can connect to a Domain Controller using LDAP, but not under the Free license.
Hello everyone, I am looking for an SPL solution to determine the length of the longest common substring of two strings. Is there any built-in way to do that? Or is there an app that provides a command for that? Thanks in advance!
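For reference, outside of SPL the classic way to compute this is dynamic programming over common suffixes. A minimal sketch in Python (the function name `lcs_len` is my own, not an existing SPL command or app):

```python
def lcs_len(a: str, b: str) -> int:
    """Length of the longest common substring of a and b, in O(len(a)*len(b))."""
    best = 0
    # prev[j] = length of the common suffix of a[:i-1] and b[:j]
    prev = [0] * (len(b) + 1)
    for i in range(1, len(a) + 1):
        cur = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                # Extend the common suffix ending at a[i-1] and b[j-1]
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best
```

For example, `lcs_len("splunk", "spline")` is 3, for the shared substring "spl".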
Hello, could you please describe your problem in a bit more detail? I am facing the same issue for a customer: the data from before two weeks ago has disappeared, even though the rest of the data belonging to the _internal index is still online. Thanks and best regards. Luca
Hi @uagraw01, if you want to use LDAP authentication, you have to: create a group in AD for each role you need in Splunk, adding the relevant users to each group; configure Splunk under [Settings > Authentication Method > Configure Splunk to use LDAP > New LDAP], entering the requested information; then map the AD groups to your Splunk roles. For more details see https://docs.splunk.com/Documentation/SplunkCloud/latest/Security/ConfigureLDAPwithSplunkWeb Ciao. Giuseppe
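Under the hood, the steps above end up in authentication.conf. A rough sketch of what the resulting configuration looks like (the host, DNs, and group names below are placeholders, not values from this thread):

```ini
[authentication]
authType = LDAP
authSettings = my_ldap

[my_ldap]
host = dc01.example.com
port = 389
bindDN = CN=splunk-bind,OU=Service Accounts,DC=example,DC=com
userBaseDN = OU=Users,DC=example,DC=com
groupBaseDN = OU=Groups,DC=example,DC=com
userNameAttribute = sAMAccountName
realNameAttribute = displayName
groupMemberAttribute = member
groupNameAttribute = cn

# Map AD groups to Splunk roles
[roleMap_my_ldap]
admin = Splunk-Admins
user = Splunk-Users
```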
@gcusello We only want to establish the authentication method. We don't want to monitor any LDAP events. We use a Windows Splunk server only, even for production.
Hi @uagraw01, answering your questions: 1) no, the LDAP functionality is for user authentication, not for extracting LDAP data. To extract LDAP data you need the add-on. 2) use your standalone server to install the app, and if possible move to Linux: Windows is useful for test environments, not for production environments! Ciao. Giuseppe
Splunk Documentation - Release Notes: The official Splunk documentation provides detailed release notes for each version. You can find the release notes for the latest version: https://docs.splunk.com/Documentation/Splunk/9.2.0/ReleaseNotes/MeetSplunk  https://docs.splunk.com/Documentation/Splunk/9.2.0/UpgradeReadiness/Releasenotes  These release notes include information about new features, enhancements, and resolved issues. Upgrade Readiness App: Install the Upgrade Readiness App in your Splunk environment. This app includes a tab to scan for Splunk platform compatibility. It assesses if your deployment is ready for an upgrade to a specific version (e.g., Splunk Enterprise 9.0). Additionally, all active admin or sc_admin users receive weekly email notifications by default https://docs.splunk.com/Documentation/Splunk/9.2.0/UpgradeReadiness/Releasenotes  Remember to keep an eye on the official Splunk channels, explore the community forums, and subscribe to relevant notifications to stay up-to-date with Splunk Enterprise releases. Happy Splunking!  
@gcusello Thank you for the response. I have a few more questions on this. 1. Can we use the LDAP functionality present in Splunk settings itself, rather than any LDAP app or add-on? 2. We have a standalone Splunk server which is based on a Windows virtual machine. So is a direct connection between the Domain Controller and the Splunk server possible with the Splunk LDAP setting?
Hi @jatin, download the app from https://splunkbase.splunk.com/app/1151 and follow the instructions at https://docs.splunk.com/Documentation/SA-LdapSearch/3.0.8/User/AbouttheSplunkSupportingAdd-onforActiveDirectory Ciao. Giuseppe
@acavenago Ensure that your HFs are correctly configured and connected to the Splunk environment. Verify that the HFs are sending data to the indexers and are part of the cluster, and that communication between the indexers and the HFs is functioning correctly. If you're setting up an indexer cluster or a search head cluster, you need to set a cluster label. For an indexer cluster, go to the CLI of your master node and run the following command: splunk edit cluster-config -cluster_label <CLUSTER LABEL>

Add search peers:
1. Log in to the instance you want to set up as the monitoring console (in our case, the master node).
2. Go to Settings > Distributed Search and click Search Peers.
3. Click New Search Peer and add all search heads, the license master, non-clustered indexers, and clustered search heads.
4. Repeat this process for each instance you want to add.
5. You don't need to add the master node here, because we are doing all of this on the master node itself, so it is added automatically.
6. Go to Settings > Monitoring Console > Settings > General Setup.
7. Click Distributed and continue.
8. Scroll down and check the status of all remote instances.
9. Check whether the server roles shown are correct for each instance; if not, click Actions > Edit and correct the server roles.
10. Go to the Overview page of your newly set up monitoring console.

Forwarder setup in the monitoring console: go to your newly set up monitoring console and click Forwarders > Forwarders: Instance. Click Setup to configure this page. Enable forwarder monitoring, choose a data collection interval, then click Save and continue. This will fetch all of your forwarder assets and build a forwarder management dashboard within the monitoring console by running a scheduled search named "DMC Forwarder – Build Asset Table".

After completing the steps above, you will be able to see all of your forwarders' information.
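The "add search peers" steps above can also be done from the CLI instead of Splunk Web. A sketch, run on the monitoring console host (host names and credentials below are placeholders):

```
# Set the cluster label on the manager/master node
splunk edit cluster-config -cluster_label my_cluster

# Add an instance (search head, license master, etc.) as a search peer
splunk add search-server https://sh01.example.com:8089 \
    -auth admin:changeme \
    -remoteUsername admin -remotePassword changeme
```

Repeat the `add search-server` command once per instance, just like step 4 above.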
Thank you!
Hi @uagraw01, you should install the SA-LdapSearch app (https://splunkbase.splunk.com/app/1151) on a Heavy Forwarder and follow the instructions at https://docs.splunk.com/Documentation/SA-LdapSearch/3.0.8/User/AbouttheSplunkSupportingAdd-onforActiveDirectory On apps.splunk.com there is another app that does the same thing, but I have never used it. Ciao. Giuseppe
Numbers always sort in numerical order.  To change the order, change the numbers by adding the year to them.

| eval weeknum=strftime(_time, "%y-%V")
| chart dc(Task_num) as Tasks over weeknum by STATUS
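The reason this works: "%y-%V" produces zero-padded "YY-WW" labels, which sort chronologically even as plain strings, across year boundaries. A quick illustration in Python (sample values are made up):

```python
# Zero-padded "YY-WW" week labels sort chronologically even as strings,
# whereas bare week numbers sort lexicographically (week 10 before week 2).
weeks = ["24-10", "23-52", "24-02", "24-01"]
print(sorted(weeks))                    # chronological order
print(sorted(["10", "2", "52", "1"]))   # bare week numbers sort badly
```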
You can find a list of third-party software used by Splunk at https://docs.splunk.com/Documentation/Splunk/latest/ReleaseNotes/Credits  The release notes for other Splunk products should have a similar list.
Have you tried incorporating the time zone in the strptime call?

| eval stime=strptime(s_time,"%Y-%m-%dT%H:%M:%S%Z")
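Splunk's strptime follows C strptime conventions, so a trailing zone designator that the format string doesn't consume can break the whole parse. As an illustration in Python (which uses %z for numeric offsets rather than Splunk's %Z for zone names; the sample timestamp is made up):

```python
from datetime import datetime

# With the offset consumed by %z, the parse succeeds and the result
# is timezone-aware, so the epoch conversion is unambiguous.
stime = datetime.strptime("2024-03-01T12:30:00+0000", "%Y-%m-%dT%H:%M:%S%z")
print(stime.tzinfo)       # UTC
print(stime.timestamp())  # 1709296200.0
```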
TIME_FORMAT is one of the "Great 8" settings every sourcetype should have.  They help ensure events are onboarded properly.  See if these settings help.

[exch_file_httpproxy-mapi]
ANNOTATE_PUNCT = false
LINE_BREAKER = ([\r\n]+)\d\d\d\d-\d\d
INDEXED_EXTRACTIONS = csv
initCrcLength = 2735
HEADER_FIELD_LINE_NUMBER = 1
MAX_TIMESTAMP_LOOKAHEAD = 24
SHOULD_LINEMERGE = false
TIMESTAMP_FIELDS = DateTime
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%Z
TRANSFORMS-no_column_headers = no_column_headers
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)\d\d\d\d-\d\d
TRUNCATE = 10000
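To sanity-check the LINE_BREAKER/EVENT_BREAKER regex above: it breaks on a run of newlines that is followed by a YYYY-MM timestamp prefix, so continuation lines stay attached to their event. A rough Python simulation (Splunk's actual splitting logic differs; this only exercises the pattern, and the sample events are made up):

```python
import re

# Same idea as LINE_BREAKER above, with a lookahead so the timestamp
# stays with the following event (Splunk itself only consumes group 1).
breaker = re.compile(r"([\r\n]+)(?=\d\d\d\d-\d\d)")

raw = ("2024-03-01T12:00:00.000Z,event one\n"
       "continuation line without a timestamp\n"
       "2024-03-01T12:00:01.000Z,event two")

# re.split keeps capture groups, so drop the pure-newline pieces.
events = [piece for piece in breaker.split(raw) if piece.strip()]
```

This yields two events, with the untimestamped line kept inside the first one.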
I have Log Analytics deployed through the agent machine using JOBs, and I parse the data with a grok expression. However, I noticed that I also receive data in the database that clearly does not match, i.e. it does not have an ERROR logLevel. I don't want to parse those events into columns, and I don't even want them in the database, due to capacity.

grok patterns:
- "%{TIMESTAMP_ISO8601:logEventTimestamp}%{SPACE}\\[%{NUMBER:logLevelId}\\]%{SPACE}%{LOGLEVEL:logLevel}%{SPACE}-%{SPACE}%{GREEDYDATA:msg}"
pattern.grok: LOGLEVEL ([Ee]rr?(?:or)?|ERR?(?:OR)?)

Required data: Unnecessary data: How can I get rid of them? Can a where clause or a filter be used?
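As a sanity check of the filtering logic itself (independent of the collection pipeline): since the custom LOGLEVEL pattern above only matches ERROR variants, any line that fails the full grok expression can be dropped before ingestion. A rough Python equivalent of that match-or-drop step (the sample lines are made up):

```python
import re

# Rough Python translation of the grok expression above:
# timestamp [levelId] LEVEL - message, where LEVEL must be ERROR-like.
line_re = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\S*)\s+"
    r"\[(?P<levelId>\d+)\]\s+"
    r"(?P<level>[Ee]rr?(?:or)?|ERR?(?:OR)?)\s+-\s+"
    r"(?P<msg>.*)$"
)

def keep(line: str) -> bool:
    """True only for lines whose log level matches the ERROR-only pattern."""
    return line_re.match(line) is not None

lines = [
    "2024-03-01T10:00:00 [3] ERROR - disk full",
    "2024-03-01T10:00:01 [6] INFO - heartbeat",
    "garbage line with no structure",
]
errors = [line for line in lines if keep(line)]
```

INFO lines and unstructured lines fail the match and are filtered out, which is the behavior you would want from a drop filter in front of the database.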