All Posts

Hi @kn450 

You should put the cert, intermediate(s), and CA in splunkWeb.pem, but not the key. The key should go in its own file (e.g. splunkWeb.key), with the privKeyPath setting pointing to its location:

[settings]
enableSplunkWebSSL = true
privKeyPath = /opt/splunk/etc/apps/webTLS/certs/splunkWeb.key
serverCert = /opt/splunk/etc/apps/webTLS/certs/splunkWeb.pem

Note: You may use absolute paths when you configure these settings by prepending a / to the path. Non-absolute paths are relative to the Splunk installation directory ($SPLUNK_HOME). If you use a non-absolute path, do not add $SPLUNK_HOME to the path.

If this does not work, please could you look in $SPLUNK_HOME/var/log/splunk/splunkd.log for any error logs which might indicate what is preventing Splunk Web from starting?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
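A quick way to sanity-check the split is to list the PEM block types in each file. Below is a minimal Python sketch (file names taken from the post; paths are examples, adjust for your deployment) that flags a private key accidentally left inside the cert-only file:

```python
# Sketch: list the PEM block labels in a file so you can confirm that
# splunkWeb.pem holds only certificates and splunkWeb.key holds the key.
import re

def pem_block_types(path):
    """Return the BEGIN-block labels found in a PEM file, in order."""
    with open(path) as f:
        return re.findall(r"-----BEGIN ([A-Z0-9 ]+)-----", f.read())

def key_leaked_into_cert(path):
    """True if a private key block appears in what should be a cert-only file."""
    return any("PRIVATE KEY" in label for label in pem_block_types(path))
```

For example, `pem_block_types("/opt/splunk/etc/apps/webTLS/certs/splunkWeb.pem")` should report only CERTIFICATE entries; if `key_leaked_into_cert` returns True on it, the key still needs to be moved out.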
How to enable TLS on the Splunk platform with a CA-issued certificate?
Hi @splunkville 

The default configurations for a sourcetype can often be "good enough" for some logs; Splunk does a good job at determining timestamp extraction. But if your logs contain multi-line events, long lines (>10000 chars), multiple timestamps, or anything like this, then it might struggle or you might get mixed results. It's also worth noting that, from a performance perspective, it's best to tweak these settings and incorporate the "Great 8" (see https://lantern.splunk.com/Splunk_Platform/Product_Tips/Data_Management/Configuring_new_source_types) to ensure accuracy and to improve the performance of the data being ingested.
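As an illustration, a props.conf stanza covering those "Great 8" settings might look like the sketch below. The sourcetype name, timestamp format, and regexes are assumptions for a hypothetical single-line log; they are not drop-in values for your data:

```ini
# Illustrative props.conf for a hypothetical single-line sourcetype;
# every value below must be adapted to the actual log format.
[my:custom:sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 25
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 10000
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = ([\r\n]+)
```

Setting SHOULD_LINEMERGE to false with an explicit LINE_BREAKER avoids the expensive line-merging pass, which is where most of the performance benefit comes from.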
Hi @splunkville,

Yes, in general, if you configure a monitor you read the file. But what is your issue, and what is your question? Are you working on a Universal Forwarder, a standalone Splunk server, or something else? Please share more details about your issue.

Ciao.
Giuseppe
Even then, the HISTCONTROL variable has to be set before running the command you don't want showing in history.  Also, contrary to what I keep reading from Google results, this variable doesn't seem to be automatically set, or at least not on the various Linux distros I've used.
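For reference, a minimal sketch of setting the variable in the current shell before running the sensitive command (the example command and token are hypothetical):

```shell
# HISTCONTROL is often unset by default; export it in the current shell
# *before* the command you want kept out of history.
export HISTCONTROL=ignoreboth   # ignorespace + ignoredups

# With ignorespace/ignoreboth active, a leading space keeps a command
# out of history in interactive shells, e.g.:
#  some-secret-command --token=abc123
echo "HISTCONTROL is: ${HISTCONTROL}"
```

Note this only affects the shell session where it is exported; to make it persistent it would need to go in ~/.bashrc or similar.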
Monitor set to pull in a watched log that has no props/transforms configs applied. This would ingest the entire file contents, correct? 
Hello everyone,

I'm encountering an issue when trying to enable secure HTTPS access to Splunk Web using an SSL certificate issued by a trusted external CA.

What I did:
- Placed the SSL certificate file (splunkWeb.pem) at the following path: $SPLUNK_HOME/etc/apps/webTLS/certs/splunkWeb.pem
- Edited the web.conf file with the following settings:

[settings]
enableSplunkWebSSL = true
serverCert = $SPLUNK_HOME/etc/apps/webTLS/certs/splunkWeb.pem
privKeyPath = $SPLUNK_HOME/etc/apps/webTLS/certs/splunkWeb.pem

- Restarted the Splunk service.

Issue: After restarting, Splunk hangs during startup and the web interface does not become available over HTTPS.

Questions:
- Are there additional steps required when using an external SSL certificate?
- Is the web.conf configuration correct, especially the privKeyPath pointing to the same .pem file as serverCert?
- Should the private key be in a separate file from the certificate?

Any advice or similar experiences would be greatly appreciated. Thank you in advance for your help!
Hi @OGS 

You need to disable replication_port://9887 by either setting a disabled=true flag or ensuring it does not exist anywhere in your configuration. You can use btool to check:

$SPLUNK_HOME/bin/splunk cmd btool server list --debug replication_port

If you have both replication_port *and* replication_port-ssl enabled, they may conflict.

Other things to note:
- serverCert must contain the server cert plus private key; sslPassword (if set) must be the private key's passphrase (not the CA's).
- sslRootCAPath must include the full trust chain (root plus any intermediates).
- The names in sslCommonNameToCheck must match the CN/SANs in the peer certificates.
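Put together, a server.conf sketch with the plain replication port disabled and only the SSL port active might look like this (the port number follows the post; the cert path and passphrase placeholder are illustrative):

```ini
# Illustrative server.conf: only one replication stanza should be active.
# The plain port is explicitly disabled; the SSL port carries replication.
[replication_port://9887]
disabled = true

[replication_port-ssl://9887]
# Must contain the server certificate *and* its private key.
serverCert = /opt/splunk/etc/auth/mycerts/serverCertWithKey.pem
# Passphrase of the private key above, if the key is encrypted.
sslPassword = changeme
```

Running the btool command above after the change should show the plain stanza as disabled and no second conflicting definition elsewhere.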
Hi @Cerum 

At this time there isn't a Splunk app for OpenAI Enterprise Compliance. If you already have access to the OpenAI Compliance API (https://chatgpt.com/admin/api-reference), then you could look at using the Splunk UCC Framework to build a custom app to poll the logs. UCC gives a good starting point, so if you're familiar with Python you may be able to get something running quite quickly.
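Whatever framework you use, the core of the ingest side is shaping each API record into a Splunk HTTP Event Collector payload. Here is a minimal sketch; the index name, sourcetype, and record fields (`created_at`, etc.) are assumptions to be replaced with the actual Compliance API response schema:

```python
# Sketch: wrap one Compliance API record into the JSON body expected by
# Splunk HEC's /services/collector/event endpoint. Field names in the
# record are hypothetical -- adjust to the real API response.
import json

def build_hec_event(record, index="openai_compliance",
                    sourcetype="openai:compliance"):
    """Build the JSON body for a single HEC event POST."""
    return json.dumps({
        "time": record.get("created_at"),   # epoch seconds, if the API provides it
        "index": index,
        "sourcetype": sourcetype,
        "event": record,                    # keep the raw record as the event body
    })
```

The resulting string would be POSTed to your stack's HEC endpoint (e.g. `https://<stack>.splunkcloud.com:8088/services/collector/event`) with an `Authorization: Splunk <hec-token>` header; batching several such objects per request keeps the request count down.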
Hi @SSEAL 

Regarding FIPS: check out https://help.splunk.com/en/appdynamics-on-premises/analytics/25.4.0/analytics/configure-analytics/configure-the-analytics-agent-for-fips-compliance and see if this helps.

Regarding smartcard auth: this isn't something natively supported; however, you might have some success using an IdP, which would act as the broker for the authentication. Check out https://docs.appdynamics.com/accounts/en/global-account-administration/access-management/configure-single-sign-on-through-saml for details on configuring SSO. From there, you would need to determine whether your IdP can support your smartcard auth process. Do you already use an IdP with your smartcards that can support SAML?
This is a challenge, both because of how different users might define "unused" and because there is no tag that says whether data is "used" or not. A good starting place, at least for Splunk Cloud users, is the "Underutilized source type(s)" alert in the CMC. Other users will need to craft a search that scrubs logs for sourcetype references, and another search that compares those references to a list of all sourcetypes on the system. It won't be perfect, either. See .conf24 session PLA1837B for more info and SPL to get you started.
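A rough SPL sketch of that compare-the-lists approach is below. It only catches sourcetypes referenced literally in ad-hoc search strings in _audit, so it will miss references made through eventtypes, macros, tags, or saved-search names; treat the result as a candidate list, not a verdict:

```spl
| tstats count where index=* by sourcetype
| fields sourcetype
| search NOT
    [ search index=_audit action=search info=granted search=*
      | rex field=search max_match=0 "sourcetype\s*=\s*\"?(?<sourcetype>[\w:_-]+)"
      | mvexpand sourcetype
      | dedup sourcetype
      | fields sourcetype ]
```

Widening the time range of the _audit subsearch (e.g. 90 days) reduces false positives from rarely-searched data.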
The TLS settings should be set the same way as they are on the management port, and your configuration looks more or less correct. What do you mean by "doesn't work"? Remember that you need a working CA for mTLS to work; self-signed certs most probably won't work.
What do you mean by "unused data"? And what granularity do you have in mind? You can search the Answers archives for similar questions; the general answer is that there is no 100% sure way to find indexes/hosts/sources which are searched (and thus those which are not).
Good afternoon,

I am very new to AppDynamics and have a lot of questions. I am in the middle of setting up on-premises, self-hosted virtual appliances directly on ESXi hosts. I have the hosts and am attempting to configure them. I've also reviewed the online documentation for securing the platform, but do not see any references to any of my questions: https://docs.appdynamics.com/appd/onprem/24.x/25.2/en/secure-the-platform

Where may I find information for the following?
1. Enabling FIPS and its related settings?
2. Enabling FIPS and its related settings on private synthetic agents?
3. How to set up smart card authentication?

Any information you may provide is greatly appreciated.

Thank you,
Stuart
I figured out the issue. It was a permissions issue: I needed to put splunkfwd on the appropriate access lists. I gave splunkfwd read access to /var/log/audit/audit.log and execute access to /var/log/audit. Now splunkfwd can execute the script either manually from the command line or as a scheduled scripted input run by the Splunk UF. In both cases, the script runs without error whether or not there is a pre-existing checkpoint in place.

I understand that the Splunk UF has the CAP_DAC_READ_SEARCH capability, which allows it to read files it normally wouldn't have access to. What I don't understand is why that capability worked fine when I asked it to generate the initial checkpoint, but suddenly stopped working the moment I asked it to use a pre-existing checkpoint. Is it possible that the CAP_DAC_READ_SEARCH capability doesn't extend to reading the inode properties of each file? If so, that would explain why the initial ausearch went fine (the inode doesn't matter because ausearch simply ingests all of the audit.log files regardless of inode), but then, when ausearch needs to look for the specific audit.log file that matches the inode listed in the checkpoint file, it can't do so.

Thank you to @PickleRick and @isoutamo for your suggestions and assistance. I couldn't have done it without you both.
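When debugging cases like this, it can help to reproduce the exact permission checks from the affected account. Below is a hypothetical helper (not part of any Splunk tooling) that reports the two permissions the script needed; run it as the splunkfwd user against the real paths. Note that `os.access` checks against the process's real uid/gid, so it may not reflect capability-based access such as CAP_DAC_READ_SEARCH:

```python
# Sketch: report whether the *current* user has the access bits a
# log-reading script needs: read on the file, traverse (execute) on
# the directory. Paths are passed in; the defaults in the post were
# /var/log/audit/audit.log and /var/log/audit.
import os

def access_report(file_path, dir_path):
    """Map each required permission to whether the current user has it."""
    return {
        "read_file": os.access(file_path, os.R_OK),
        "traverse_dir": os.access(dir_path, os.X_OK),
        "file_visible": os.path.exists(file_path),  # needs traverse on parents
    }
```

If `traverse_dir` is False, stat-style operations (including inode lookups) inside the directory will fail even when the file itself is readable, which matches the checkpoint symptom described above.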
I needed to preface the view name with /app/SplunkEnterpriseSecuritySuite/ Sample: Investigate Identity Artifacts - "/app/SplunkEnterpriseSecuritySuite/ident_by_name" Investigate Asset Artifacts - "/app/SplunkEnterpriseSecuritySuite/asset_artifacts" Investigate File/Process Artifacts - "/app/SplunkEnterpriseSecuritySuite/file_artifacts"
What is the best app to detect unused data? Any suggestions?
Has anyone had any luck getting OpenAI Compliance API logs into Splunk Cloud? This API ships logs that provide visibility into prompts/replies with ChatGPT. I'm looking to ingest this data to monitor for possible sensitive/confidential data being uploaded. OpenAI has built-in integrations with several applications (https://help.openai.com/en/articles/9261474-compliance-api-for-enterprise-customers); surprisingly, Splunk is not one of them. My question is: has anyone had any luck getting these logs into Splunk? I have the API key from OpenAI, but I'm struggling to create a solution to ingest these logs into Splunk, and I'm honestly surprised there isn't a native application built by Splunk for this.
I disabled AD object resolution by setting evt_resolve_ad_obj = 0 in the Splunk_TA_windows app, and the logs have now ceased.
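For anyone landing here later, the change described above would typically go in a local/ inputs.conf of the TA. The sketch below assumes the Security event log stanza; apply it to whichever WinEventLog stanza is generating the noise:

```ini
# Illustrative inputs.conf override in Splunk_TA_windows/local/;
# the stanza name assumes the Security event log input.
[WinEventLog://Security]
evt_resolve_ad_obj = 0
```

With the setting at 0, the forwarder stops querying Active Directory to resolve object names in events, which is what was producing the extra log traffic.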
In a Splunk dashboard, I'm using the custom visualization "3D Graph Network Topology Viz". The goal is that when clicking on a node, a token is set so another panel can display related details.

The issue is:
- When configuring On Click → Manage tokens on this dashboard, Splunk shows the message: "This custom visualization might not support drilldown behavior."
- When clicking on a node, the $click.value$ token does not update and literally remains as $click.value$, which confirms that it is not sending dynamic values.
- The only token that actually receives data is $click.name$, which returns the node's name, but not the other values I'd like to capture.

Has anyone successfully implemented full drilldown support in this visualization, or knows how to extend it so that more tokens (like $click.value$) can be populated when clicking on a node?
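Not a full answer, but one workaround sketch while $click.value$ is unavailable: drive the detail panel from $click.name$ alone and look up the other values by node name in the panel's search. The Simple XML below is illustrative (viz type string, index, and field names are assumptions), and it assumes the viz honors the <drilldown> element:

```xml
<!-- Sketch: set a token from click.name, the one token the viz populates. -->
<viz type="network_topology_app.network_topology">
  <search>
    <query>index=my_topology | stats count by src dest</query>
  </search>
  <drilldown>
    <set token="selected_node">$click.name$</set>
  </drilldown>
</viz>

<!-- Detail panel keyed off that token; joins back on the node name. -->
<panel depends="$selected_node$">
  <table>
    <search>
      <query>index=my_topology node_name="$selected_node$" | table *</query>
    </search>
  </table>
</panel>
```

Populating additional $click.*$ tokens would require changing the visualization's source (its JavaScript drilldown handler), since the set of tokens a custom viz emits is decided inside the viz itself.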