All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi Team, is it possible to use inputlookup on a CSV file with 7 columns and fill in the details of those columns using a search command that fetches the data from Splunk?

Example: my CSV looks like this:

Column1, Column2
Value A1, Value B1
Value A2, Value B2
Value A3, Value B3
Value A4, Value B4

I need output like the below:

Column1, Column2, Column3, Column4
Value A1, Value B1, Value C1, Value D1
Value A2, Value B2, Value C2, Value D2
Value A3, Value B3, Value C3, Value D3
Value A4, Value B4, Value C4, Value D4

The values of Column3 and Column4 are fetched from Splunk using a search command, keyed on the value of Column1.

I've tried the search below, but it is not working:

| inputlookup File.csv
| join Column1 type=left
    [ | tstats latest(Column3) as START_TIME, latest(Column4) as END_TIME
        where index=main source=xyz ]
| table Column1, Column2, START_TIME, END_TIME
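One possible issue, sketched below: the tstats subsearch produces no join key, so there is nothing to match the lookup rows against. Adding a `by Column1` clause gives each subsearch row a Column1 value to join on. This sketch assumes Column1 is an indexed field visible to tstats; if it is not, replace the subsearch with an ordinary `search ... | stats` over the raw events:

```
| inputlookup File.csv
| join type=left Column1
    [ | tstats latest(Column3) as START_TIME, latest(Column4) as END_TIME
        where index=main source=xyz by Column1 ]
| table Column1, Column2, START_TIME, END_TIME
```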
Hi, I have a token as below in my dashboard:

<set token="mySource">replace($numSuffix$,"_","")</set>

and I have another token utilising the above:

<set token="source">$indexg$ connection $mySource"</set>

In the query I have:

<search>
    <query>$source$ | timechart count by host</query>
</search>

Unfortunately this is not working; the function in the token is not being evaluated. The search runs as:

index=xer connection replace(_45t66,"_","") | timechart count by host

I tried:

<set token="mySource">replace($numSuffix|s$,"_","")</set>

but it's of no use. Can someone help me with this? Thanks.
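One likely cause, sketched under the assumption that this is a Simple XML dashboard: `<set>` assigns the token's value verbatim and never evaluates functions, whereas `<eval>` treats the body as an eval expression. Something like the following (the `|s` filter quotes each substituted value so the expression stays valid):

```xml
<eval token="mySource">replace($numSuffix|s$, "_", "")</eval>
<eval token="source">$indexg|s$ . " connection " . $mySource|s$</eval>
```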
I'm using this built-in lookup to determine the country for GPS coordinates as follows:

| lookup geo_countries latitude, longitude output featureId as Country

The issue is that this lookup doesn't return anything for some coordinates. Some examples:

40.711157112847644,-74.01527355439009
40.8293703,-73.9709533
22.2866493,114.195508
-33.84808469677436,151.28320075054089
-38.0159081,-57.5320673

| makeresults
| eval latitude="40.711157112847644"
| eval longitude="-74.01527355439009"
| lookup geo_countries latitude, longitude output featureId as Country

Google Maps is able to find an approximate location for the above coordinates. Can anybody provide some guidance, please? Many thanks.
Hello, we have separate indexes created for non-prod and prod.

Sample index names:
sony_app_XXXXXX_non_prod - for the non-prod env
sony_app_XXXXXX_prod - for the prod env

XXXXXX are application ID numbers (different per application), and we have other indexes as well (besides the non-prod and prod ones). I want a field called env derived from the index name: for all non-prod indexes env should be "Non-Prod", and for prod indexes env should be "Prod".

I tried the command below:

index=sony* | eval env=if(index="*non_prod*", "Non-Prod", "Prod")

This does not work, because we also have other indexes whose names include neither non_prod nor prod, yet it assigns "Prod" to every event. Kindly help me with a solution to achieve this.
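A sketch of one approach: eval's `=` comparison does not expand `*` wildcards (wildcards only work in the search portion), so the condition is never true. Using match() with a three-way case() handles the other indexes too; this assumes the environment suffix always ends the index name:

```
index=sony*
| eval env = case(
    match(index, "non_prod$"), "Non-Prod",
    match(index, "_prod$"),    "Prod",
    true(),                    "Other")
```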
Hi everyone, I am testing the Smart Agent appdcli utility and encountered an issue. When I try to UPGRADE a machine agent by running the following command:

appd upgrade machine --inventory hosts.ini --connection ssh --config config.ini --version latest

the agent starts communicating with the Controller, but ServerMonitoring fails (see figure_1).

However, when I try to INSTALL the same agent and version with the following command:

appd install machine --inventory hosts.ini --connection ssh --config config.ini --version latest

everything works fine (see figure_2). Do you have any idea why? The problem only appears when I upgrade a machine agent running on Linux (Ubuntu 23.10 [mantic]); on Windows I have not encountered this issue.

Regards, Lukas
Hello Team, we are trying to integrate Splunk Enterprise (version 9.3.2) with AppDynamics SaaS (version 24.10). As per the documentation, we need to add the AppDynamics SaaS IP address to the search head API allow list in Splunk. To add the IP address I navigated to Server Settings, but I am unable to see the "IP Allow List" option in the Splunk console. Note: I have logged into Splunk with an admin ID. Please help me fix this issue. Thanks, Selvaganesh E
Dashboard Studio - error while updating the auto-refresh value: [Error: Visualization is not present in layout structure]: Visualization "viz_XQInZkvE" is not present in Layout Structure. If I try to change the refresh rate from 2m to any other interval, I get the above error. It looks like some default value was cloned from another dashboard. Could someone help with this? The relevant part of my dashboard definition is:

"title": "E2E Customer Migration Flow - MigrationEngine + NCRM Clone",
"description": "BPM Dashboard.SparkSupportGroup:Sky DE - Digital Sales - Product Selection & Registration",
"defaults": {
    "dataSources": {
        "ds.search": {
            "options": {
                "queryParameters": {
                    "latest": "$global_time.latest$",
                    "earliest": "$global_time.earliest$"
                },
                "refresh": "2m"
            }
        }
    }
}
The Jamf Pro Add-On for Splunk does not work with Splunk Cloud. We have spent days trying to get this working with both Jamf and Splunk, only to find that this setup is currently incompatible. This has been confirmed by both Jamf and Splunk. It appears that the 'Jamf Protect Add-On' is compatible with Splunk Cloud. Hopefully these two add-ons are similar in construction and the Jamf Pro Add-On can be updated ASAP. https://splunkbase.splunk.com/app/4729 https://learn.jamf.com/en-US/bundle/technical-paper-splunk-current/page/Integrating_Splunk_with_Jamf_Pro.html Thanks!
I am configuring TLS communication between a UF (Universal Forwarder) and an Indexer. My outputs.conf configuration is as follows:

[tcpout]
defaultGroup = default-autolb-group

[tcpout-server://xxxxxxx:9997]

[tcpout:default-autolb-group]
server = xxxxxxx:9997
disabled = false
sslPassword = ServerCertPassword
sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/myCertAuthCertificate.pem
sslVerifyServerCert = false
useACK = true
sslCertPath = /opt/splunkforwarder/etc/auth/mycerts/myCombinedServerCertificate.pem

I have three questions:
1. I don't need a client certificate right now. If I don't set sslCertPath, an error occurs. Is this option mandatory?
2. Currently, I have set sslCertPath to the server certificate, and TLS communication works. Why do I need to set the server certificate on the client? Is this a common practice?
3. If I want to use a client certificate, which configuration setting should I use?
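For question 3, a sketch of a client-certificate configuration, assuming a recent forwarder version where the `clientCert` setting supersedes the older `sslCertPath` name (the certificate path and password here are placeholders, not values from your environment):

```
[tcpout:default-autolb-group]
server = xxxxxxx:9997
useACK = true
# client certificate (PEM containing cert + private key) presented to the indexer
clientCert = /opt/splunkforwarder/etc/auth/mycerts/myClientCertificate.pem
sslPassword = ClientKeyPassword
# CA certificate used to verify the indexer's server certificate
sslRootCAPath = /opt/splunkforwarder/etc/auth/mycerts/myCertAuthCertificate.pem
sslVerifyServerCert = true
```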
Hello, I have an index named msad and I want to know which forwarder is sending data to this index, and also where that data comes from on the forwarder (i.e., from which source files it is being sent).
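A quick sketch: the default host, source, and sourcetype fields usually answer both questions, since host identifies the forwarder and source the file path it reads from:

```
| tstats count where index=msad by host, source, sourcetype
```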
KV Store maintenance mode change can only be done on static captain.
Hi, please help me extract DUSTER and JUNIPER as app_name from the following sample events:

1. unit_hostname="GBWDC111AD011HMA.systems.uk.fed" support_id="16675049156208762610" vs_name="/f5-tenant-01/DUSTER-GBM-FR-DEV/v-dusteruat.systems.uk.fed-443" policy_name="/Common/waf-fed-transparent"

2. unit_hostname="GBWDC111AD011HMA.systems.uk.fed" support_id="16675049156208762610" vs_name="/f5-tenant-01/JUNIPER-GBM-FR-DEV/v-juniperuat.systems.uk.fed-443" policy_name="/Common/waf-fed-transparent"

The app_names will be dynamic, and there is no guarantee that GBM will always appear beside the app_name. I tried this: vs_name=\"\/.*\/(?<app_name>.*)\-GBM but, as I said, GBM will not be present in all events. Please make the regex generic for me. Thanks.
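A sketch, assuming the app name is always the first hyphen-delimited token of the second path segment of vs_name (which holds in both samples, without relying on GBM at all):

```
... | rex field=vs_name "^/[^/]+/(?<app_name>[^/-]+)"
```

This skips the first segment (the tenant, e.g. f5-tenant-01) and then captures everything up to the next hyphen or slash, yielding DUSTER and JUNIPER respectively.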
Hello, I have a question about using a Python library in a Splunk ML Toolkit algorithm. I opened the ARIMA.py file under splunk/etc/apps/Splunk_ML_Toolkit/bin/algos, as below.

=== Contents ===
[root@master algos]# pwd
/opt/splunk/etc/apps/Splunk_ML_Toolkit/bin/algos
[root@master algos]# more ARIMA.py
#!/usr/bin/env python
import datetime
import pandas as pd
import numpy as np
from statsmodels.tsa.arima.model import ARIMA as _ARIMA
from statsmodels.tools.sm_exceptions import MissingDataError
=========================

Among the contents of ARIMA.py it says `import pandas as pd`. Where does this pandas library come from, i.e. which Python environment provides it? When I run ARIMA.py as below, I get a message that the module is not found.

=== Execution Results ===
[root@master algos]# python3 ARIMA.py
Traceback (most recent call last):
File "ARIMA.py", line 5, in <module>
import pandas as pd
ModuleNotFoundError: No module named 'pandas'
[root@master algos]#
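One likely explanation, offered as a sketch: MLTK algorithms are executed by Splunk's bundled Python interpreter (with the Python for Scientific Computing add-on supplying pandas, numpy, and statsmodels), not by the operating system's python3, so importing pandas from the system interpreter fails even though it works inside Splunk. Assuming a default /opt/splunk install, you can check the bundled interpreter directly:

```
/opt/splunk/bin/splunk cmd python3 -c "import pandas; print(pandas.__version__)"
```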
I want to reset my Splunk Enterprise password.
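A common recovery sketch for a lost admin password on Splunk Enterprise, assuming a default /opt/splunk install and filesystem access (back up files before editing; the username and password below are placeholders):

```
# stop Splunk and move the existing password store aside
/opt/splunk/bin/splunk stop
mv /opt/splunk/etc/passwd /opt/splunk/etc/passwd.bak

# seed a new admin credential, picked up on the next start
cat > /opt/splunk/etc/system/local/user-seed.conf <<'EOF'
[user_info]
USERNAME = admin
PASSWORD = ChangeMe123!
EOF

/opt/splunk/bin/splunk start
```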
Currently we connect to a PostgreSQL database using username/password authentication. Now we need to switch to certificate-based authentication. I've created the certificate on the server. Can anyone please guide me on how to configure this in the DB Connect web GUI?
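A sketch of the connection side, assuming the standard PostgreSQL JDBC driver and hypothetical file paths: pgjdbc's TLS parameters can be appended to the JDBC URL in the connection settings (note pgjdbc expects the client key converted to PKCS#8 DER format):

```
jdbc:postgresql://dbhost:5432/mydb?ssl=true&sslmode=verify-full&sslcert=/opt/splunk/etc/auth/pg/client.crt&sslkey=/opt/splunk/etc/auth/pg/client.pk8&sslrootcert=/opt/splunk/etc/auth/pg/root.crt
```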
Failed to start KV Store process. See mongod.log and splunkd.log for details. Please help.
Hello. I have created an index under a custom app; it is reflected in Splunk Web, but when I set up the universal forwarder to monitor logs for the same index, nothing shows up on the indexer. Also, my KV Store status is showing as failed, telling me to check mongod.log and the Splunk key. Please help with this.
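A sketch of the forwarder side, with a hypothetical log path and sourcetype. One common gotcha: the index must exist on the indexer itself, not just on the instance where you created it in Splunk Web, or the forwarded events are discarded:

```
# $SPLUNK_HOME/etc/apps/my_custom_app/local/inputs.conf on the UF
[monitor:///var/log/myapp/app.log]
index = my_custom_index
sourcetype = myapp:log
disabled = false
```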
I've tried a few methods shared here to adjust the start/end times of a span. Mainly:

1.
| eval _time=_time-3600 | bin _time span=4h | eval _time=_time+3600

2.
| timechart span=4h aligntime=@h-120m

However, after testing, neither of these actually offsets the span. It only changes the times shown in the resulting table; the values (in my case counts) in each box do not change, just the _time values. Am I doing something wrong? For example:

_time        A   B   C
1/28 00:00   2   1   2
1/28 04:00   4   2   4
1/28 08:00   6   3   6
1/28 12:00   8   4   8
1/28 16:00  10   5  10

_time        A   B   C
1/27 22:00   2   1   2
1/28 02:00   4   2   4
1/28 06:00   6   3   6
1/28 10:00   8   4   8
1/28 14:00  10   5  10
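One likely gotcha, sketched under the assumption that the bin is followed by a timechart: timechart re-bins _time itself, which discards the shifted buckets and only relabels the rows. Shifting, binning, and aggregating with stats instead (then pivoting with xyseries to get the timechart shape) keeps the offset bucket boundaries:

```
| eval _time = _time - 7200
| bin _time span=4h
| stats count by _time, host
| eval _time = _time + 7200
| xyseries _time host count
```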
Greetings, are there any official AWS CloudFormation templates to create the necessary roles and SNS/SQS services for using the Splunk Add-on for AWS to ingest CloudTrail data into Splunk?
Hi. Can I use react-router nested routes in the Splunk UI Toolkit? Overlapping routing with react-router results in an error page upon reload.