You will not be able to put the KV Store into maintenance mode with a dynamic captain. To get around this, you can temporarily switch to a static captain using the following command:

/opt/splunk/bin/splunk edit shcluster-config -mode member -captain_uri https://your-Captain-SH-address:8089 -election false

After this you should be able to check that dynamic_captain is 0 (splunk show shcluster-status) and then enable maintenance mode.

Please let me know how you get on and consider upvoting/karma this answer if it has helped.

Regards

Will
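Laid out as separate commands, the sequence described above looks like this (the captain URI is a placeholder for your own search head's address; the maintenance-mode step itself is whatever procedure you were already following):

```
# Temporarily switch from a dynamic to a static captain (placeholder captain URI)
/opt/splunk/bin/splunk edit shcluster-config -mode member \
    -captain_uri https://your-Captain-SH-address:8089 -election false

# Verify that dynamic_captain is now 0 before enabling KV Store maintenance mode
/opt/splunk/bin/splunk show shcluster-status
```

Remember to switch back to dynamic captain election once maintenance is complete.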
I have resolved this issue. The cause was in the UF outputs.conf configuration. Thank you all for your help. However, I don't understand why this configuration is required. I have posted a new question. https://community.splunk.com/t5/Security/Questions-about-UF-outputs-conf-Configuration/m-p/710701#M18322
I am configuring TLS communication between a UF (Universal Forwarder) and an Indexer. My outputs.conf configuration is as follows:

[tcpout]
defaultGroup = default-autolb-group

[tcpout-server://xxxxxxx:9997]

[tcpout:default-autolb-group]
server = xxxxxxx:9997
disabled = false
sslPassword = ServerCertPassword
sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/myCertAuthCertificate.pem
sslVerifyServerCert = false
useACK = true
sslCertPath = /opt/splunkforwarder/etc/auth/mycerts/myCombinedServerCertificate.pem

I have three questions:

1. I don't need a client certificate right now, but if I don't set sslCertPath, an error occurs. Is this option mandatory?
2. Currently I have set sslCertPath to the server certificate, and TLS communication works. Why do I need to set the server certificate on the client? Is this a common practice?
3. If I want to use a client certificate, which configuration setting should I use?
Hi @SN1

You can modify the search below to use the metrics.log to get this information. Update the series= value with the index name you want to look at, and you may also want to exclude your indexer(s), as these also collect the metrics on index thruput:

index=_internal series=YourIndex group=per_index_thruput host!=YourIndexer*
| eval gb=kb/1024/1024
| timechart sum(gb) AS gb by host

This will give a chart showing the GB of data for each forwarder.

Please let me know how you get on and consider upvoting/karma this answer if it has helped.

Regards

Will
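As an illustration of what the eval and timechart are doing (not Splunk code, just a plain-Python sketch with made-up sample events), the per-host GB totals amount to:

```python
# Hypothetical per_index_thruput events: kilobytes indexed, tagged by forwarder host
events = [
    {"host": "uf-01", "kb": 2097152.0},  # 2 GB
    {"host": "uf-01", "kb": 1048576.0},  # 1 GB
    {"host": "uf-02", "kb": 524288.0},   # 0.5 GB
]

totals = {}
for e in events:
    gb = e["kb"] / 1024 / 1024           # same conversion as | eval gb=kb/1024/1024
    totals[e["host"]] = totals.get(e["host"], 0.0) + gb

print(totals)  # {'uf-01': 3.0, 'uf-02': 0.5}
```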
Hello, I have an index named msad and I want to know which forwarder is sending data to this index. I would also like to know where that data is stored, i.e. which location on the forwarder this data is being sent from.
Hi @KwonTaeHoon

Have you installed the Python for Scientific Computing (PSC) app from Splunkbase? This is a prerequisite for MLTK (see https://docs.splunk.com/Documentation/MLApp/5.5.0/User/Installandconfigure).

The pandas library is within the PSC app at:

(Splunk_SA_Scientific_Python_linux_x86_64)/bin/linux_x86_64/4_2_2/lib/python3.9/site-packages/pandas

This is assuming you are running the latest PSC app on linux_x86_64.

Please let me know how you get on and consider upvoting/karma this answer if it has helped.

Regards

Will
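If you can run the PSC app's Python interpreter directly, one quick way to confirm which pandas it resolves is to ask importlib for the module's path (a generic stdlib sketch; "json" is only included as a module that is guaranteed to exist):

```python
import importlib.util

def module_path(name):
    """Return the file path a module would be loaded from, or None if not importable."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec and spec.origin else None

print(module_path("json"))    # a path inside the interpreter's stdlib
print(module_path("pandas"))  # should point under the PSC app's site-packages, or None
```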
Why not start with your actual events? OK, assuming these now represent your events, try something like this instead:

| rex "The following products did not have mappings from PC: (?<product>\S+)"
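If you want to sanity-check the pattern outside Splunk first, the same regex works with Python's re module (using a shortened version of one of the sample events from the question):

```python
import re

# Same capture as the SPL rex, with Python's (?P<name>...) named-group syntax
PATTERN = r"The following products did not have mappings from PC: (?P<product>\S+)"

line = ("[ERROR] 2025-02-05 08:24:33.165 [http-nio-8080-exec-10] "
        "com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - "
        "The following products did not have mappings from PC: HIGCommercialAuto "
        "higawsaccountid: 463251740121")

m = re.search(PATTERN, line)
print(m.group("product"))  # HIGCommercialAuto
```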
For your first issue, relating to the data not being received into Splunk, please check that the inputs.conf on the UF is set up with the same index name as you have defined in the indexes.conf on your indexers. You should also check that your user has permissions to see this index on your search head.

Regarding the mongo issue, please can you confirm which version of Splunk you are running and whether it was a fresh install or an upgrade from a previous version? It's worth starting with splunkd.log/mongod.log in $SPLUNK_HOME/var/log/splunk (or look in the _internal index for these logs) to see if there are any error/fatal/critical messages that might point to why Mongo isn't starting. If you find anything in the logs then let us know here and we can try and help you work through it.

Regards

Will
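As a hypothetical illustration of the index-name match described above (all names and paths are placeholders, not your actual config):

```
# inputs.conf on the UF
[monitor:///var/log/myapp/app.log]
index = myapp_index

# indexes.conf on the indexers - the stanza name must match the UF's index setting
[myapp_index]
homePath   = $SPLUNK_DB/myapp_index/db
coldPath   = $SPLUNK_DB/myapp_index/colddb
thawedPath = $SPLUNK_DB/myapp_index/thaweddb
```

If the UF sends to an index name that has no matching stanza on the indexers, the events are dropped or blocked rather than searchable.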
There aren't any official CloudFormation templates for sending data from CloudTrail to S3/SQS as far as I know; however, there is Project Trumpet (https://github.com/splunk/splunk-aws-project-trumpet), which helps create CloudFormation for HEC-based CloudTrail (and other) feeds from AWS, if this helps?

If you are using Splunk Cloud then you can also use the Data Manager app, which can help set up AWS feeds into Splunk, but again I believe this uses Firehose/HEC rather than SQS-based S3.

I hope one of these helps you get started.

Will
Below are the steps which need to be performed for this password update. To update the credentials in the Qualys TA for Splunk, follow the steps below:

Performed from the Splunk Support side:
- Click Settings > DATA > Data inputs.
- On the Data inputs screen, click TA-QualysCloudPlatform.
- On the Qualys screen, disable all the listed data inputs.
- Open a Linux console terminal.
- Delete the passwords.conf file (/opt/splunk/etc/apps/TA-QualysCloudPlatform/local/passwords.conf).
- Reboot the IDM Splunk instance. (This can be done anytime; please don't ask for a maintenance window. Just inform us once this is completed so we can perform the remaining steps from our end.)

Performed from the customer side:
- Go to the Splunk UI and click Apps > Manage Apps.
- Click Setup against the Qualys Technology Add-on for Splunk option.
- On the TA-QualysCloudPlatform screen, enter the new credentials under Qualys Credentials.
- Click Save.

Performed from the Splunk side:
- Open a Linux console terminal.
- Navigate to /opt/splunk/etc/apps/TA-QualysCloudPlatform/local/ and check whether the passwords.conf file was created.
- On the Qualys screen, enable all the listed data inputs.
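The console portion of those steps can be sketched as the following commands (paths as given in the steps; the backup copy is an optional precaution I would suggest, not part of the official procedure):

```
cd /opt/splunk/etc/apps/TA-QualysCloudPlatform/local/
cp passwords.conf passwords.conf.bak    # optional safety copy before deleting
rm passwords.conf

# After the new credentials are saved in the UI, confirm the file was recreated
ls -l /opt/splunk/etc/apps/TA-QualysCloudPlatform/local/passwords.conf
```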
[ERROR] 2025-02-05 08:24:33.165 [http-nio-8080-exec-10] com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - The following products did not have mappings from PC: HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
[ERROR] 2025-02-05 08:24:33.165 [http-nio-8080-exec-10] com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - The following products did not have mappings from PC: HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
[ERROR] 2025-02-05 08:24:33.165 [http-nio-8080-exec-10] com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - The following products did not have mappings from PC: HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
[ERROR] 2025-02-05 08:08:33.464 [http-nio-8080-exec-12] com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - The following products did not have mappings from PC: HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
[ERROR] 2025-02-05 08:04:21.339 [http-nio-8080-exec-73] com.thehartford.bi.mm.clearanceapp.services.policysummary.impl.HFPProduct - The following products did not have mappings from PC: HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/b75f6bcde90f4aceaf9edbbeb13c5e58

These are the logs, and I want to extract the string just before the higawsaccountid field, for example HIGCommercialAuto.
Hi @z_diddy

There doesn't seem to be a way to remove this dot. Even looking through the source code documentation, there are no properties for it: https://docs.splunk.com/Documentation/SplunkCloud/9.3.2408/DashStudio/dashDef

There doesn't seem to be a consistent CSS class to override its styles with, either. Perhaps you could suggest making this dot styleable at https://ideas.splunk.com/