All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hello, I want to build a search. I am using:

index=endpoint_defender source="AdvancedHunting-DeviceInfo"
| rex field=DeviceName "(?<DeviceName>\w{3}-\w{1,})."
| eval DeviceName=upper(DeviceName)

This gives me the device names. Then:

| lookup snow_os.csv DeviceName OUTPUT OS BuildNumber Version

This matches the device names against the lookup and returns OS, BuildNumber, and Version as output. Now I want to compare those fields against another lookup to determine whether each operating system is outdated or not. How can I do this?
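A minimal sketch of one way to do the comparison, assuming you maintain a second lookup (hypothetically named outdated_os.csv here, with columns OS and MinSupportedBuild) listing the minimum still-supported build per OS:

index=endpoint_defender source="AdvancedHunting-DeviceInfo"
| rex field=DeviceName "(?<DeviceName>\w{3}-\w{1,})."
| eval DeviceName=upper(DeviceName)
| lookup snow_os.csv DeviceName OUTPUT OS BuildNumber Version
| lookup outdated_os.csv OS OUTPUT MinSupportedBuild
| eval os_status=if(tonumber(BuildNumber) < tonumber(MinSupportedBuild), "outdated", "supported")
| table DeviceName OS BuildNumber Version os_status

The lookup name, the MinSupportedBuild column, and the numeric comparison are all assumptions; build strings that are not purely numeric would need a different comparison.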
What are the reasons for the slowness observed in the Splunk Mission Control incident review dashboard?
Hi all, I have the following situation with a query returning a table of this kind:

fieldA  fieldB
A       2
A       2
B       4
B       4

I need to add a column to this table that sums fieldB only once per unique fieldA value, i.e. a new column holding 2 + 4 = 6, so the table would look like this:

fieldA  fieldB  sum_unique
A       2       6
A       2       6
B       4       6
B       4       6

I know I have to use | eventstats sum() here, but I am struggling with how to specify that each unique fieldA value must be counted only once. Thanks in advance, Miguel
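A minimal sketch of one approach, assuming fieldB is constant within each fieldA value: mark the first event per fieldA with streamstats, sum fieldB only for those marked events, and let eventstats attach the total to every row:

your_base_search
| streamstats count as occurrence by fieldA
| eval fieldB_once=if(occurrence=1, fieldB, 0)
| eventstats sum(fieldB_once) as sum_unique
| fields - occurrence fieldB_once

With the sample data this yields sum_unique=6 on all four rows, since only the first A row and the first B row contribute to the sum.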
Where can I get a list of all outdated OS versions for my dashboard? Is there a site or a feed for this?
Hi all, I have custom apps for alert actions and data inputs built in Add-on Builder. I need to rebuild them, but I don't have the export or the original source from the Add-on Builder instance used for development. Is there a way to reverse engineer them, or any other way to recover the original source for a rebuild? Thanks in advance for any answer or suggestion. Splunk Add-on Builder
Are there any plans to update this app?    
Environment:
Splunk Enterprise 9.x (Windows, on-prem)
Domain: mydomain.duckdns.org (via DuckDNS)
Certbot for Let's Encrypt certificate generation

Goal:
Use the correct Certbot CLI command to generate certificates for Splunk HEC.
Resolve curl: (28) Connection timed out when testing HTTPS.

Specific Issues:

1. Certbot CLI and Certificate Handling
The Let's Encrypt README warns against copying/moving certificates, but Splunk requires specific paths.
Question: What is the exact Certbot command to generate certificates for Splunk HEC on Windows? Should I copy fullchain.pem and privkey.pem to Splunk's auth/certs directory despite the warnings?

2. HTTPS curl Failure
After configuring SSL in server.conf, curl times out:

curl -k -v "https://localhost:8088/services/collector" -H "Authorization: Splunk <HEC_TOKEN>"
* Connection timed out after 4518953 milliseconds

Question: Why does curl time out even after enabling SSL in Splunk? Is localhost:8088 valid for testing, or must I use mydomain.duckdns.org:8088?

Steps Taken:
Generated certificates with certbot certonly --standalone -d mydomain.duckdns.org.
Copied fullchain.pem and privkey.pem to $SPLUNK_HOME/etc/auth/certs.
Configured server.conf:

[httpServer]
enableSSL = true
sslCertPath = $SPLUNK_HOME/etc/auth/certs/fullchain.pem
sslKeyPath = $SPLUNK_HOME/etc/auth/certs/privkey.pem

Confirmed port 8088 is open in Windows Firewall.
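For what it's worth, a hedged sketch of where HEC's TLS settings usually live: in inputs.conf under the [http] stanza rather than in server.conf; the combined PEM file name below is an assumption, and the key names should be verified against the inputs.conf spec for your 9.x version:

# inputs.conf on the HEC-receiving instance
[http]
disabled = 0
enableSSL = 1
# serverCert expects a PEM containing the certificate chain followed by the
# private key; building hec_cert.pem by concatenating fullchain.pem and
# privkey.pem is an assumption to validate for your setup
serverCert = $SPLUNK_HOME/etc/auth/certs/hec_cert.pem

A timeout (rather than a TLS handshake failure) usually points at connectivity rather than certificates, so testing against localhost:8088 first is a reasonable way to separate the two problems.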
Hello, is it possible to have only one Universal Forwarder installed on a Windows server, with this UF sending data to two different Splunk instances? For example: 1. Source: IIS logs -> Destination: Splunk Cloud 2. Source: Event Viewer data -> Destination: on-premises Splunk Enterprise. If yes, can you point me to an article that helps set this up? One other possible constraint: we have a deployment server that should allow us to set up both flows. Thanks for your help
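A hedged sketch of the usual pattern: define one tcpout group per destination in outputs.conf and route per input with _TCP_ROUTING in inputs.conf (all server names below are placeholders; Splunk Cloud destinations normally come from the credentials/forwarder app your Cloud stack provides):

# outputs.conf
[tcpout]
defaultGroup = onprem_indexers

[tcpout:splunkcloud]
server = inputs.yourstack.splunkcloud.com:9997

[tcpout:onprem_indexers]
server = onprem-idx.example.com:9997

# inputs.conf
[monitor://C:\inetpub\logs\LogFiles]
sourcetype = iis
_TCP_ROUTING = splunkcloud

[WinEventLog://Application]
_TCP_ROUTING = onprem_indexers

Since you have a deployment server, both files can be delivered in deployment apps so each flow stays centrally managed.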
Can anyone give me an idea, or a Python script, to: generate a diag file in Splunk; log in to the Splunk support portal and enter the case number; and upload the file automatically?
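A hedged sketch of the first step only, driving the "splunk diag" CLI from Python (assumes SPLUNK_HOME is set and the splunk binary exists at the usual path). The support-portal upload is deliberately left out: the case portal has no public upload API I can vouch for here, so that part would need whatever mechanism Splunk Support documents for your account.

# generate_diag.py - minimal sketch, Linux paths assumed
import glob
import os
import subprocess

SPLUNK_HOME = os.environ.get("SPLUNK_HOME", "/opt/splunk")

def generate_diag() -> str:
    """Run 'splunk diag' and return the path of the newest diag tarball."""
    splunk_bin = os.path.join(SPLUNK_HOME, "bin", "splunk")
    # splunk diag writes diag-<host>-<date>.tar.gz into the working directory
    subprocess.run([splunk_bin, "diag"], check=True, cwd=SPLUNK_HOME)
    diags = glob.glob(os.path.join(SPLUNK_HOME, "diag-*.tar.gz"))
    if not diags:
        raise FileNotFoundError("no diag archive produced")
    return max(diags, key=os.path.getmtime)

if __name__ == "__main__":
    print("diag written to:", generate_diag())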
Hello, this is Krishna. I have been doing a POC on accessing Splunk logs through REST APIs. I was successful calling the REST APIs on Splunk Enterprise, but at my company we have Splunk Cloud, and I am unable to call the REST APIs the way I did on Splunk Enterprise. I would like to know how I can call the Splunk REST APIs for the Cloud edition. Here are my findings: on my local Splunk instance, hitting https://localhost:8089/services lists all available services (it asked for admin credentials, which I provided). I am interested in https://localhost:8089/services/search/jobs, so I would like to call the equivalent endpoints on the Cloud version. Thanks in advance.
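A hedged sketch of the equivalent call against a managed Splunk Cloud stack (the stack name is a placeholder, and on managed Splunk Cloud the management port typically has to be opened for your source IP first, e.g. via an allowlist request or the Admin Config Service):

curl -u your_user:your_password \
  "https://yourstack.splunkcloud.com:8089/services/search/jobs" \
  -d search="search index=_internal | head 5"

Creating the job returns a search job SID, which you then poll and fetch results for via /services/search/jobs/<sid>/results, the same as on Splunk Enterprise.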
Hello Splunk Community! Welcome to another week of fun curated content as part of our Splunk Answers Community Content Calendar! This week's posts are about regex extraction, Search & Reporting, and Splunk Enterprise. We will be highlighting some of the Splunk users and experts who contributed on the Splunk Search board.

Splunk Community Spotlight: Extracting FQDN from JSON Using Regex
Regex can be a tricky beast, especially when you're trying to extract specific values from structured fields like JSON. Fortunately, the Splunk community continues to come together to solve these common challenges.
The Challenge
Here we're highlighting a helpful interaction from the Splunk Search board on Splunk Answers, where Karthikeya asked for assistance refining a regex pattern to extract a fully qualified domain name (FQDN) from a JSON field.
The Solution
The solution was simple, effective, and more broadly applicable. Gcusello explains that the regex captures everything after the prefix (such as v-) up until the trailing port number, assigning it to the field fqdn. The solution was also accompanied by a link to Regex101, allowing others to test and validate the expression. This improved version is more readable and reliable across similar patterns.
Field extraction is a foundational task in Splunk, whether you're indexing logs, building dashboards, or writing alerts. Getting it right the first time helps ensure your data is usable and searchable in meaningful ways. Regex plays a crucial role in this, but without best practices, you may end up with inconsistent results. Thanks to contributors like gcusello, users gain cleaner, more maintainable solutions that benefit the entire community.

Splunk Community Spotlight: Joining Data Across Indexes with Field Coalescing
In the fast-paced world of network monitoring and security, Splunk users often need to correlate data across multiple indexes. In this week's Splunk Search board, we're diving into a great question raised by MrGlass and how yuanliu and gcusello helped unlock the solution with a clean and efficient approach.
The Challenge
While trying to correlate data between two indexes, running the search over a larger time window made the results inconsistent, likely due to the structure of the join and differences in field naming.
The Solution
Gcusello provided a great solution and yuanliu refined it. Rather than renaming fields individually or relying on joins (which can break or jumble data across time windows), both offered a streamlined approach using conditional logic to cleanly unify the data.
Working with multiple data sources in Splunk is common, especially in environments where network telemetry and authentication logs live in different places. Field normalization using coalesce() ensures your queries remain flexible, efficient, and easier to manage at scale.

Key Takeaways
Splunk's community is a powerful learning resource, from regex optimizations to full-scale deployment strategies. If you're ever stuck, don't hesitate to ask a question on Splunk Answers, and you might just find a simpler, better way to solve your challenge.
Shout-Out
Big thanks to Karthikeya & MrGlass for raising the issues and to gcusello & yuanliu for providing such clean and effective solutions. This is the kind of collaborative problem-solving that makes the Splunk Community thrive!
Would you like to feature more solutions like this?
Reach out to @Anam Siddique on Slack or @Anam on Splunk Answers to highlight your question, answer, or tip in an upcoming Community Content post! Beyond Splunk Answers, the Splunk Community offers a wealth of valuable resources to deepen your knowledge and connect with other professionals! Here are some great ways to get involved and expand your Splunk expertise: Role-Based Learning Paths: tailored to help you master various aspects of the Splunk Data Platform and enhance your skills. Splunk Training & Certifications: a fantastic place to connect with like-minded individuals and access top-notch educational content. Community Blogs: stay up to date with the latest news, insights, and updates from the Splunk community. User Groups: join meetups and connect with other Splunk practitioners in your area. Splunk Community Programs: get involved in exclusive programs like SplunkTrust and Super Users, where you can earn recognition and contribute to the community. And don't forget, you can connect with Splunk users and experts in real time by joining the Slack channel. Dive into these resources today and make the most of your Splunk journey!
Hi, I have a JSON message with 4-5 JSON key-value pairs. I want to remove some of the fields and modify the body before sending it to the Splunk server. On the OTel collector I tried using the filelog receiver and transform-processor log statements to set the body. My JSON contains:

Body: Str({"instant":{"epochSecond":1747736470,"nanoOfSecond":616797000},"thread":"ListProc-Q0:I0","level":"DEBUG","message":"{actual log message}"})

My requirement is to remove the instant, thread, and level fields and send only the JSON value of the message field, which changes dynamically. The updated body is printed in the debug log, but the Splunk server still shows the original body as-is. My transform config is:

transform:
  log_statements:
    - context: log
      statements:
        - 'set(body, ParseJSON(body)["message"])'
        - 'set(body, "extracted")'

But my Splunk server is still showing the original body as-is. Can someone please help me with this issue?
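A hedged sketch of a working shape for this. One frequent cause of "the debug log shows the new body but Splunk receives the original" is that the transform processor is defined under processors: but never referenced in the logs pipeline, so the sketch includes the wiring (receiver and exporter names are placeholders for your actual components):

processors:
  transform:
    log_statements:
      - context: log
        statements:
          # parse once into the log record's cache, then replace the body
          - set(cache, ParseJSON(body))
          - set(body, cache["message"])

service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: [transform]
      exporters: [splunk_hec]

Also note that the second statement in your snippet, set(body, "extracted"), runs after the first and overwrites the parsed message with the literal string "extracted", so it should be removed once debugging is done.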
I have a few questions about how the DMC gathers its information for server specifications and how to extract it. I am trying to add resources to our slower indexers, following the official resource docs.

To begin, I am viewing my infrastructure through the DMC: Monitoring Console -> Settings -> General Setup. In this view, my server is said to have 4 cores and 15884 MB of memory. These are Azure VMs, so they use vCPUs; this one specifically is an Azure VM indexer. We have latency issues with these, and I believe it is because they are under-resourced.

Looking at my server's CPU specs via lscpu, and comparing the other hosts from both sources, I have concluded that total cores = cores per socket x sockets (this makes sense). From some references I was looking at, it seemed like I needed to be looking at the CPU(s) value from the lscpu command. For this example, the server has 8 vCPUs, and it is recommended we have at least 12. When upgrading, do I need to focus on the cores, or do I just need to specify how many vCPUs I need?

I also wanted to know how I can extract this view from the DMC. I want to table all of the resource metrics so I can get a full picture of what I have in my Splunk environment. Thank you for any guidance or insight.
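A hedged sketch for tabling the same specs the DMC reads, using the server/info REST endpoint (the server-group name below is the DMC's conventional indexer group and may differ in your setup; verify the exact field names your version returns):

| rest splunk_server_group=dmc_group_indexer /services/server/info
| table splunk_server numberOfCores numberOfVirtualCores physicalMemoryMB os_name cpu_arch

On vCPU-based cloud hosts, the virtual-core count is usually the figure to compare against the reference-hardware recommendations.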
Hi Splunk Community, I’m working on a use case where data is stored in Elasticsearch, and I’d like to use Splunk solely as an interface for visualizing and querying the data using SPL (Search Processing Language) — without ingesting or storing the data again in Splunk, to avoid duplication and unnecessary storage costs. My main questions are: Is there a way to connect Splunk directly to Elasticsearch as an external data source? Can Splunk query external data (like from Elasticsearch) using SPL, without indexing it? Are there any available add-ons, modular inputs, or scripted solutions that allow this type of integration? Is this approach officially supported by Splunk, or would it require a custom integration? I’m aware that tools like Logstash or Kafka can be used to bring data into Splunk, but that’s exactly what I’m trying to avoid — I don’t want to duplicate the data storage. If anyone has experience with a similar setup, or any recommendations, I’d greatly appreciate your input. Thanks in advance!  
Hello, I have a search that takes 5 minutes to complete when looking at only the last 24 hours. If possible, could someone help me figure out how to improve it? I need to dedup by SessionId and combine 3 fields into a single field.

source="mobilepro-test"
| dedup Session.SessionId
| strcat UserInfo.UserId " " Location.Site " " Session.StartTime label
| table Session.SessionId, label

It looks like the dedup is causing the slowness, but I have no idea how to improve that. Thanks for any help on this one, Tom
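A hedged sketch of one common optimization: dedup carries whole events through the pipeline, while stats keeps only the fields it needs, so replacing dedup with a stats aggregation by SessionId is often much faster (this assumes any one of the duplicate events per session carries the three fields you need):

source="mobilepro-test"
| stats first(UserInfo.UserId) as UserId first(Location.Site) as Site first(Session.StartTime) as StartTime by Session.SessionId
| eval label=UserId." ".Site." ".StartTime
| table Session.SessionId, label

Restricting the base search to a specific index, or adding | fields Session.SessionId UserInfo.UserId Location.Site Session.StartTime immediately after it, can also cut the work done per event.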
Hi, we want to connect Splunk to an SAP HANA database. Do you have any ideas? Do we use ngdbc.jar and put that driver in $SPLUNK_HOME/etc/apps/splunk_app_db_connect/drivers? Regards.
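A hedged sketch, assuming Splunk DB Connect with a custom connection type. The driver class com.sap.db.jdbc.Driver and the jdbc:sap:// URL prefix are SAP's documented values for ngdbc.jar, but verify the stanza keys against the db_connection_types.conf.spec shipped with your DB Connect version:

# $SPLUNK_HOME/etc/apps/splunk_app_db_connect/local/db_connection_types.conf
[sap_hana]
displayName = SAP HANA
serviceClass = com.splunk.dbx2.DefaultDBX2JDBC
jdbcDriverClass = com.sap.db.jdbc.Driver
jdbcUrlFormat = jdbc:sap://<host>:<port>
port = 30015

After dropping ngdbc.jar into the drivers directory you mention and restarting the DB Connect task server, the new connection type should appear in the connection setup UI.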
Need assistance creating a diag file on Splunk Edge Processor.
We have a lab Splunk deployment with the following specification: 3 indexers in an indexer cluster; 1 SH for normal searches; 1 SH with ITSI installed; 1 SH with Enterprise Security installed; 1 server that acts as the cluster manager for the indexers and as the license manager. We have NFR licenses (Enterprise, ITSI) installed on the license manager, and all the other servers are configured as license peers. With this setup, the problem is that the ITSI license doesn't take effect, and we only get IT Essentials Work. When the ITSI license is installed directly on the ITSI search head, ITSI works correctly (but the other licenses don't apply in that case, because they are installed on the license manager). We installed the required applications (SA-ITSI-Licensechecker and SA-UserAccess) on the license manager as per the official documentation. Has anyone encountered a similar problem, and if so, what was the solution?
Hello Splunkers, good day! I'm getting this error consistently. Out of confusion: does this mean it's the estimated KV Store result size? The max_size_per_result_mb value is currently set to the default (50 MB); note that 52428800 bytes is exactly 50 MB, so the message appears to print the limit in bytes despite the _mb name.

KVStorageProvider [123456789 TcpChannelThread] - Result size too large, max_size_per_result_mb=52428800, Consider applying a skip and/or limit

Thanks!
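A hedged sketch, if raising the cap turns out to be appropriate (the value is in MB; the alternative the message suggests, paging the KV Store query with skip/limit, avoids holding one huge result in memory and is often preferable):

# limits.conf on the search head running the KV Store lookup
[kvstore]
max_size_per_result_mb = 100

Whether 100 MB is a sensible ceiling depends on available memory; the setting name is from limits.conf.spec, but the exact behavior should be verified for your Splunk version.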
Hello, we use Splunk Enterprise 9.3.2 with LDAP integration. We granted an AD group 90 capabilities in ITSI, on top of the analyst role, so that they can create correlation searches, episodes, and policies but not delete them. These particular users are getting an error (screenshot not included). Does anyone know why access gets blocked?