All Topics

Hi all, I took a 14-day trial, set up the integration with Kubernetes (EKS), and tried to configure 4 services (of the same type, Java) to send traces. I installed the Helm chart with splunk-otel-collector and integrated splunk-otel-javaagent.jar (all according to the instructions in the Integration section). All services have the same configuration, but only 1 service sends traces; the other 3 do not. (BTW, earlier the same 4 services sent traces to Datadog without problems.) There are no errors in the logs similar to these: https://github.com/signalfx/splunk-otel-java/blob/main/docs/troubleshooting.md Any ideas for checks? Thanks
While trying to change the password on a Splunk HF, I get the error below:

/opt/splunk/bin/splunk edit user admin -password new_password -role admin -auth admin:changeme ---> "Couldn't complete HTTP request: Connection reset by peer"

Your help would be much appreciated.
Hello guys, I'm trying to ingest an exported sysmon log file into Splunk. I got the file from the Splunk attack_data repository, and I have already installed the Microsoft Sysmon add-on. Splunk attack_data's link:    Every time I choose xmlWinEventLog:Microsoft-Windows-Sysmon/Operational as the source type, it gives me a "Not found" error. I'd appreciate your support: how can I ingest exported sysmon logs into Splunk? Thanks, Awni
Rollback during installation of Splunk Enterprise on Windows 64-bit. Please, I need help.
Hi all, one of our new clients is interested in using Splunk to monitor their application. To set up Splunk for their application, what initial details do we need to request from the client, such as which OS they use, how many servers, which cloud, which database, etc.? If there is a template available for gathering the basic required details from a client, kindly share it with me, or let me know what specific details we need to request. Thanks in advance.
Hi all, we are currently using 3 Heavy Forwarders on Windows servers. Due to budget constraints we are planning to move all HFs to Linux servers. Kindly guide and suggest how to move a HF from Windows to Linux. How do we copy the already-installed apps and the existing settings and configuration files from Windows to Linux? Thanks in advance for your reply.
Hello, I need to find the limit on how many users can be online in Splunk Enterprise at the same time. I have a search head cluster of 4 SHs and 1 load balancer. Thanks
Do all HOT buckets on one indexer roll to WARM, creating small buckets, because the connection between the indexer and the cluster master was broken?
Hello: I am trying to get fields from different events into the same table. I have two different events; let's say they have these fields:

First event: Field1 = A, Field2 = B
Second event: Field1 = A, Field3 = C

So if I run the following:

index=whatever sourcetype=whatever | table Field1 Field2 Field3

I get a table like this:

Field1    Field2    Field3
A         B
A                   C

I am trying to get the table to look like this, because Field1 has the same value:

Field1    Field2    Field3
A         B         C

Basically, I am trying to pull a value from one event where the message ID or session ID is unique, have Splunk find another event with a matching message ID, grab a different value from that separate event, and output it to the same row in a table so the values in the table correspond to their respective message IDs.
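A common way to merge fields from different events that share a key is stats. This is a sketch assuming Field1 (or the message/session ID field) is the correlation key; the index, sourcetype, and field names are placeholders taken from the question:

```
index=whatever sourcetype=whatever
| stats values(Field2) as Field2 values(Field3) as Field3 by Field1
```

For the message-ID case, replace Field1 in the by clause with the ID field; each row then holds all values seen for that ID across events.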
Let's say I have data in an event that looks like this:

NAME: John
NAME: Mary
NAME: Sue

Assuming I have no idea how many names will exist in the event, is it possible to use the rex command to parse out all the names and display them in separate fields? Thanks, Jonathan
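One approach (a sketch, assuming each name follows a literal "NAME:" prefix as in the sample) is rex with max_match=0, which captures every match into a single multivalue field rather than stopping at the first:

```
| rex max_match=0 "NAME:\s+(?<name>\S+)"
| table name
```

If one row per name is wanted instead of one multivalue cell, `| mvexpand name` can follow the rex.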
Hi there! I'm wondering if anyone out there has experience with using Data Manager for Azure onboarding. According to this link https://docs.splunk.com/Documentation/DM/1.7.0/User/GDIOverview#Getting_data_in_for_Microsoft_Azure there are only TWO supported sourcetypes: azure:monitor:aad and azure:monitor:activity. The searches for the Enterprise Security Analytic Stories for Azure use a macro named azuread, which looks for a specific sourcetype (mscs:azure:eventhub). Does DM produce the sourcetype needed for the ES stories, or will I still need to ingest event hub data via the Splunk Add-on for Microsoft Cloud Services TA?
Hi, I am facing an issue with the eval if condition. Please help.   index=main, source=ls.csv | eval new_field = if(error=200,"sc","cs",if(error=500,"css","ssc")) | table error new_field     Regards Suman P.
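For reference, eval's if() takes exactly three arguments (condition, true value, false value), so a four-argument call fails; the comma between index and source is also not valid SPL. A sketch of what the question seems to aim at, using case() with a final catch-all (the sc/cs value mapping is illustrative, since the intent of the extra argument in the broken call is unclear):

```
index=main source=ls.csv
| eval new_field = case(error==200, "sc", error==500, "css", true(), "ssc")
| table error new_field
```

Nested if() calls work too: `if(error==200, "sc", if(error==500, "css", "ssc"))`.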
Dear all, I have created a TA to monitor a custom Python script named "log_parser_v1.py". Here is the configuration from /splunk/etc/apps/TA-logs/default/inputs.conf:

[script://./bin/log_parser_v1.py]
python.version = python3.9
interval = 300
disabled = false

But when running, the TA failed with the error "ModuleNotFoundError: No module named 'syslog'". So I am trying to debug with splunk cmd python, and it throws the same error:

[ss@localhost bin]$ ./splunk cmd python log_parser_v1.py
Traceback (most recent call last):
  File "bin/log_parser_v1.py", line 7, in <module>
    import syslog
ModuleNotFoundError: No module named 'syslog'

But the same script runs fine with the command python3.9 bin/log_parser_v1.py. Here are a few lines from the script, with the import of the "syslog" module on line 7:

[ss@localhost bin]$ cat log_parser_v1.py
#!/usr/bin/env python
import os, sys
sys.path.append('/usr/bin/python3.9')
sys.path.append('/usr/lib/python3.9/site-packages')
sys.path.append('/usr/lib64/python3.9/site-packages')
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
import json, logging, syslog, datetime, argparse, shutil, zipfile, tarfile, bz2, socket, sys, errno, time, gzip, hashlib
from logging.handlers import SysLogHandler, SYSLOG_TCP_PORT
from syslog import LOG_USER

To use python3.9, I appended the python3.9 package paths in the script, but it still does not pick up the syslog module.
Here is the python3.9 path:

[ss@localhost bin]$ whereis python
python: /usr/bin/python2.7 /usr/bin/python3.6 /usr/bin/python3.6m /usr/bin/python3.9 /usr/lib/python2.7 /usr/lib/python3.6 /usr/lib/python3.9 /usr/lib64/python2.7 /usr/lib64/python3.6 /usr/lib64/python3.9 /usr/include/python3.9 /usr/include/python2.7 /usr/include/python3.6m /usr/share/man/man1/python.1.gz

I also tried to import the syslog package with ./splunk cmd python, but it failed:

[ss@localhost bin]$ ./splunk cmd python
Python 3.7.11 (default, May 25 2022, 12:23:55)
[GCC 9.1.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> import syslog
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'syslog'
>>> exit()

And here it is imported successfully with python3.9:

[ss@localhost bin]$ python3.9
Python 3.9.7 (default, Sep 13 2021, 08:18:39)
[GCC 8.5.0 20210514 (Red Hat 8.5.0-3)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import syslog
>>> exit()

Guys, I am looking for your help to understand what is missing. Please help.
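For context, the syslog module is an optional, Unix-only C extension, and an embedded interpreter build (such as the one a vendor ships) may omit it; manipulating sys.path cannot help, because the module is compiled into the stdlib rather than installed as a file. One portable workaround, sketched below and not specific to the script above, is to use the pure-stdlib logging.handlers.SysLogHandler, which speaks the syslog protocol itself and needs no C module (the logger name and address are illustrative):

```python
import logging
from logging.handlers import SysLogHandler

# The 'syslog' C module may be missing from embedded Python builds.
try:
    import syslog
    HAVE_SYSLOG = True
except ModuleNotFoundError:
    HAVE_SYSLOG = False

def get_syslog_logger(name="log_parser"):
    """Return a logger that emits syslog messages over UDP,
    with no dependency on the 'syslog' C module."""
    logger = logging.getLogger(name)
    if not logger.handlers:
        # SysLogHandler implements the protocol itself (UDP by default),
        # so it works on any Python, including restricted builds.
        logger.addHandler(SysLogHandler(address=("localhost", 514)))
    logger.setLevel(logging.INFO)
    return logger

logger = get_syslog_logger()
logger.info("hello from a restricted interpreter")
```

The script already imports SysLogHandler, so porting the syslog calls to it would also remove the need for the sys.path.append lines.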
Hi Community, I have a use case where the client needs data to be stored over an extended period of time. That data powers dashboards whose panels are generated from datamodels. Since the client wants data to be available for at least 6 months, the idea was to create an index that keeps hot/warm buckets on SSD and cold buckets on slower storage. I have two different issues here:

1. I have implemented this setup in our test environment with mixed storage for hot and cold buckets. Is there a way for me to check where my data is being stored?
2. Since my dashboards are all powered by datamodels, I have a question about the storage location and format of accelerated data. Does the datamodel summary folder store the complete accelerated data, or does it hold pointers to where the data actually resides?

The main problem is that if we have mixed SSD and HDD storage, and all the dashboards are powered by datamodels, how much will this affect Splunk's performance? Will dashboard load time be affected by such a storage model?

Regards, Pravin
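For the first question, one way to see where each bucket physically lives (a sketch; your_index is a placeholder) is the dbinspect command, which reports each bucket's state and filesystem path:

```
| dbinspect index=your_index
| table bucketId state path sizeOnDiskMB
```

Comparing the path column against the index's homePath and coldPath shows which buckets sit on the SSD volume versus the slower storage.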
I am trying to get a wildcard to work with a where clause. I'm not sure if I'm doing something wrong altogether or just missing some syntax, but my search is as follows:

index=my_index | where description=" Changed * role to * Admin"

Basically, I'm looking up whether any user had their role changed to any admin role. I thought this would be an easy one, and it is not.
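The where command compares the string literally, so the asterisks never act as wildcards there. The search command does expand them; a sketch using the field name from the question:

```
index=my_index
| search description="*Changed * role to * Admin*"
```

Staying with where is also possible via the like() function, which uses % as the wildcard: `| where like(description, "%Changed % role to % Admin%")`.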
Does anyone know if it's possible to create a cluster for the deployment server or the master server? I'm asking because we could fail over to DR more easily in case of a datacenter change, tests, or disasters. Also, our deployment server is quite slow (we have more than 5,000 universal forwarders); I think a deployment server cluster could solve this issue. Does anyone have any ideas? Is it possible?
Hi, one thing that doesn't seem to be documented is how Splunk handles Linux file permissions when files are pushed from the deployer to the search head cluster. Docs: https://docs.splunk.com/Documentation/Splunk/9.0.2/DistSearch/PropagateSHCconfigurationchanges For example, I have an app "/opt/splunk/etc/shcluster/apps/my app". This app has a script under "/opt/splunk/etc/shcluster/apps/my app/bin/helloworld.sh". The script has permissions "-rwxr-x---" on the deployer, but when I push it to the search head cluster it gets permissions "-rw-rw-r--" on the search head cluster members. Note that the executable bit is removed, making the script unusable. I'm using Splunk version 9.0.2 on both the deployer and the search head cluster members. A colleague of mine is having the same problem, so I don't think it is something wrong with my Splunk environment in particular. Is anyone else experiencing this problem, and is there a workaround?
Hi, I am using the following SPL in a Splunk query. The field AdditionalData holds multiple values, and I am splitting them to extract separate fields. Now, if any of these extracted fields is blank, I want the default value "Not Available" to appear. Thanks in advance.

| eval AdditionalData=if(isnotnull('cip:AuditMessage.ExtraData'),'cip:AuditMessage.ExtraData',"Not Available")
| rex field=AdditionalData "Legal employer name:(?<LegalEmployerName>[^,]+)"
| rex field=AdditionalData "Legal entity:(?<LegalEntity>[^,]+)"
| rex field=AdditionalData "Country:(?<Country>[^,]+)"
| rex field=AdditionalData "Business unit:(?<BusinessUnit>[^,]+)"
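When a rex pattern does not match, the capture field stays null, so one option (a sketch using the field names from the question) is to append fillnull after the extractions:

```
| fillnull value="Not Available" LegalEmployerName LegalEntity Country BusinessUnit
```

fillnull only covers null fields; if an extraction can yield an empty string rather than no match, an extra eval such as `if(len(Country)=0, "Not Available", Country)` would be needed as well.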
Hello everyone! I have recently started working with Splunk UI and I'm getting used to all the options it has. I guess my question may be a little dumb, but I'm stuck and not finding any documentation that helps me through it. At the moment, I have my app running with a dashboard page where I'm able to load some visualization components. My issue comes when importing the indexes I have to read data from. Where can I find those indexes, and how should I write the import to read the data in them? Any link to relevant documentation, or any help, would be highly appreciated.
Hello, I have a collection of logs (same sourcetype), but some of them have different or additional fields. To figure out when they appear, I'm trying to create a query that shows me which fields are distinct within specific time ranges. Let's say I have 200 events from 13:00 to 14:00. Now I want to group the stats values(*) results by creating time-range fields:

| eval timerange1=(13:00 to 13:15), timerange2=(13:15 to 13:30)

so I can use

| stats values(*) by timerange1, timerange2

I was considering using date_hour, date_minute, etc., but I think there must be an easier way, as I would need additional commands. Also, I don't know the right format, as I keep getting "Type checking failed. '-' only takes numbers." So do you have any suggestions how I could solve this? I'm thankful for any help.

Kind regards, Alex
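Rather than building time-range fields by hand, SPL's bin command can bucket _time into fixed spans, which then serves directly as the group-by key. A sketch (the 15m span matches the ranges described; the overall 13:00-14:00 window is assumed to come from the time picker):

```
| bin _time span=15m
| stats values(*) as * by _time
```

Each output row then lists the field values seen in one 15-minute bucket, so differing or additional fields stand out per bucket.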