HIGCommercialAuto higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
MLM-RS-H higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
MLM-R3-N higawsaccountid: 463251740121 higawslogstream: app-5091-prod-1-ue1-EctAPI/EctAPI/17eea8553cb8434bb4c126047817da16
These are basically 3 different logs, and the highlighted value needs to be extracted into a field named product_name.
Regular expressions work on pattern matching, and two examples are not many to secure a reliable pattern. That being said, if your data has already been extracted into the vs_name field, you could try something like this: | rex field=vs_name "^\/[^\/]+\/(?<app_name>\w+)\-"
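If you want to sanity-check that rex pattern outside Splunk, the same named-group idea can be tried in Python (the sample vs_name value below is hypothetical, shaped like the /tenant/APP-... paths the pattern expects):

```python
import re

# Python spells named groups (?P<name>...); the pattern mirrors the rex above:
# skip the first path segment, then capture word characters up to a hyphen.
pattern = re.compile(r"^/[^/]+/(?P<app_name>\w+)-")

vs_name = "/f5-tenant-01/DUSTER-GBM-FR-DEV/v-example-443"  # hypothetical sample
m = pattern.search(vs_name)
print(m.group("app_name"))  # DUSTER
```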
Hi @wadekuhl , as @richgalloway also said, you have to download the add-on from your Splunk Cloud instance. One additional hint: if you have many on-premise systems (devices, PCs, servers, etc.), it's a best practice to have two Heavy Forwarders as concentrators for all the on-premise systems; in this way, you must open only the connections between these two systems and Splunk Cloud, instead of from all systems. In this case, you have to install the add-on only on these two systems and not on all systems. Ciao. Giuseppe
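As a sketch of that two-concentrator layout, each on-premise forwarder would point its outputs at both Heavy Forwarders (the hostnames below are placeholders, not from the original post):

```
# outputs.conf on each on-premise forwarder (hypothetical hostnames)
[tcpout]
defaultGroup = hf_concentrators

[tcpout:hf_concentrators]
server = hf1.example.local:9997, hf2.example.local:9997
```

Only the two Heavy Forwarders then need outbound connectivity to Splunk Cloud.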
Hi @splunklearner , maybe you should redesign your indexes, because hundreds of indexes are really too many! About the dashboard, you could configure your input not to automatically run the searches (no default value), so all users (also admins) must choose the indexes to use in the search. Or, for admins, create a different search with an additional panel (with a fast search) to select only one or a few indexes to display. Last choice (the most structured): put your data in a custom Data Model and use it in the dashboard searches. Ciao. Giuseppe
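For the "no default value" idea, in Simple XML that just means defining the index input without a default element, so the token stays unset until a user picks something (the token name and populating search below are illustrative, not from the original post):

```
<input type="multiselect" token="idx">
  <label>Indexes</label>
  <fieldForLabel>index</fieldForLabel>
  <fieldForValue>index</fieldForValue>
  <search>
    <query>| eventcount summarize=false index=* | dedup index | table index</query>
  </search>
  <!-- no <default> element: the dashboard searches wait for a selection -->
</input>
```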
Hi @Andre_ , this is the mechanism used by the Deployment Server, so you can apply it, but it requires a restart of the local Splunk every time. Are you sure that my hint isn't applicable? Ciao. Giuseppe
Hi, Please extract DUSTER and JUNIPER as app_name from the following sample events - 1. unit_hostname="GBWDC111AD011HMA.systems.uk.fed" support_id="16675049156208762610" vs_name="/f5-tenant-01/DUSTER-GBM-FR-DEV/v-dusteruat.systems.uk.fed-443" policy_name="/Common/waf-fed-transparent" 2. unit_hostname="GBWDC111AD011HMA.systems.uk.fed" support_id="16675049156208762610" vs_name="/f5-tenant-01/JUNIPER-GBM-FR-DEV/v-juniperuat.systems.uk.fed-443" policy_name="/Common/waf-fed-transparent" The app_names will be dynamic, and there is no guarantee that GBM will appear beside the app_name every time. I tried this - vs_name=\"\/.*\/(?<app_name>.*)\-GBM - but as I said, GBM will not be the same in all events. Please make it generic and give me the regex. Thanks
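One way to check that a candidate pattern stays generic (anchoring on the path structure of vs_name rather than on the literal GBM) is to run it over the two samples plus a hypothetical event that has no GBM segment at all:

```python
import re

# Capture the start of the second path component of vs_name: skip the tenant
# segment, then take everything up to the first hyphen or slash.
pattern = re.compile(r'vs_name="/[^/]+/(?P<app_name>[^/-]+)')

events = [
    'vs_name="/f5-tenant-01/DUSTER-GBM-FR-DEV/v-dusteruat.systems.uk.fed-443"',
    'vs_name="/f5-tenant-01/JUNIPER-GBM-FR-DEV/v-juniperuat.systems.uk.fed-443"',
    # hypothetical event with no GBM, to prove the pattern does not depend on it:
    'vs_name="/f5-tenant-01/PANTHER-PROD/v-pantheruat.systems.uk.fed-443"',
]
print([pattern.search(e).group("app_name") for e in events])
# ['DUSTER', 'JUNIPER', 'PANTHER']
```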
@Cartoon520 Enable SSL: Select this check box to enable Secure Sockets Layer (SSL) encryption for the connection. SSL support is not available for all connection types. For further information, see http://docs.splunk.com/Documentation/DBX/3.18.1/DeployDBX/Installdatabasedrivers and https://docs.splunk.com/Documentation/DBX/3.18.1/DeployDBX/Installdatabasedrivers#Enable_SSL_for_your_database_connection
@alin
1. Stop Splunk Enterprise.
2. Find the passwd file for your instance ($SPLUNK_HOME/etc/passwd) and rename it to passwd.bk.
3. Create a file named user-seed.conf in your $SPLUNK_HOME/etc/system/local/ directory. In the file, add the following text:
[user_info]
PASSWORD = NEW_PASSWORD
In place of "NEW_PASSWORD", insert the password you would like to use.
4. Start Splunk Enterprise and use the new password to log into your instance from Splunk Web.
If you previously created other users and know their login details, copy and paste their credentials from the passwd.bk file into the passwd file and restart Splunk.
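The steps above are just file operations plus a restart; the snippet below rehearses them in a scratch directory rather than a live install (point SPLUNK_HOME at your real install only when you mean it; the credential line is fake):

```python
import os
import shutil
import tempfile

SPLUNK_HOME = tempfile.mkdtemp()  # scratch stand-in for /opt/splunk
os.makedirs(os.path.join(SPLUNK_HOME, "etc", "system", "local"))
passwd = os.path.join(SPLUNK_HOME, "etc", "passwd")
with open(passwd, "w") as f:
    f.write(":admin:$6$fakehash:::\n")  # fake existing credential line

# 1. Rename passwd so Splunk rebuilds it from the seed on next start.
shutil.move(passwd, passwd + ".bk")

# 2. Seed the new admin password.
seed = os.path.join(SPLUNK_HOME, "etc", "system", "local", "user-seed.conf")
with open(seed, "w") as f:
    f.write("[user_info]\nPASSWORD = NEW_PASSWORD\n")

# 3. On a real install, start Splunk and log in with the new password.
print(os.path.exists(passwd + ".bk"), os.path.exists(passwd))  # True False
```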
You must have heard the phrase "time is of the essence." This is especially true in time-series systems such as Splunk. Could you start from the beginning and describe your use case? What is the input, what is the expected output, and what is the logic between input and expected output, without SPL?
Hello, I have a question about using a Python library in a Splunk ML Toolkit algorithm. Open the ARIMA.py file in the path splunk/etc/apps/Splunk_ML_Toolkit/bin/algos, as below.
=== Contents ===
[root@master algos]# pwd
/opt/splunk/etc/apps/Splunk_ML_Toolkit/bin/algos
[root@master algos]# more ARIMA.py
#!/usr/bin/env python
import datetime
import pandas as pd
import numpy as np
from statsmodels.tsa.arima.model import ARIMA as _ARIMA
from statsmodels.tools.sm_exceptions import MissingDataError
=========================
Among the contents of ARIMA.py, it says "import pandas as pd". Where is the pandas library being brought in from? When I run ARIMA.py as below, I get a message that the module is not found.
=== Execution Results ===
[root@master algos]# python3 ARIMA.py
Traceback (most recent call last):
  File "ARIMA.py", line 5, in <module>
    import pandas as pd
ModuleNotFoundError: No module named 'pandas'
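The short answer to "where does pandas come from" is: from whichever interpreter runs the script. Splunk ships its own Python (invoked via $SPLUNK_HOME/bin/splunk cmd python3) with the MLTK dependencies installed, while the system python3 searches its own, separate site-packages, which is why the import fails there. A small sketch of this interpreter-local module resolution:

```python
import importlib.util
import sys

def module_origin(name):
    """Return the file a module would be imported from in THIS interpreter,
    or None if it is not installed here."""
    spec = importlib.util.find_spec(name)
    return getattr(spec, "origin", None)

# The interpreter path decides which site-packages get searched:
print(sys.executable)

# Stdlib modules always resolve; third-party ones only if installed here.
print(module_origin("json") is not None)    # True
print(module_origin("no_such_module_xyz"))  # None
```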
Currently we connect to a PostgreSQL database using username/password authentication. Now we need to switch to certificate-based authentication. I've created a certificate on the server. Can anyone please guide me on how to configure this in the DB Connect Web GUI?
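While the exact DB Connect GUI fields vary by version (check the connection settings against the DB Connect documentation), on the PostgreSQL side certificate authentication is usually expressed through the pgJDBC driver's documented URL parameters. A hypothetical JDBC URL (host, database, and file paths are placeholders) would look like:

```
jdbc:postgresql://dbhost.example.com:5432/mydb?ssl=true&sslmode=verify-full&sslcert=/path/client.crt&sslkey=/path/client.pk8&sslrootcert=/path/root.crt
```

Note that pgJDBC expects the client key in PKCS-8 format (the .pk8 file above), not the PEM key that OpenSSL produces by default.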
This worked for me on a fresh dashboard, thank you. I tried this on an existing dashboard, though, and quickly found out that if you get the numbers even a little off, you end up hiding or deleting (I wasn't sure which actually happened) other panels elsewhere on the dashboard; it's like they get pushed off into the ether. I ended up having to rebuild the dashboard from scratch. A friendly heads-up for anyone who comes along in the future!
Hello. I have created an index under a custom app from Splunk Web and it is reflected there, but when I set up the universal forwarder to monitor logs for the same index, nothing is reflected on the indexer. Also, my KV store is showing status "failed" and telling me to check mongod.log and the Splunk key. Please help with this.