All Topics


I have a table with values, and based on a checkbox selection in the panel I want to filter the table rows. Users can select a checkbox so that the table displays values greater than 100, between 10 and 100, or less than 10. How can I use a conditional operator in the input? When I try to add > or < to the input value, the search in the panel doesn't work.
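One possible approach (a sketch, not a drop-in solution: the token name, field name, and surrounding dashboard XML are assumptions) is to make each checkbox choice carry a complete comparison expression and splice the token into a where clause, instead of typing a bare operator into the input value:

```xml
<input type="checkbox" token="range_filter">
  <label>Value range</label>
  <!-- each choice value is a full boolean expression usable in "where" -->
  <choice value="value &gt; 100">Greater than 100</choice>
  <choice value="value &gt;= 10 AND value &lt;= 100">Between 10 and 100</choice>
  <choice value="value &lt; 10">Less than 10</choice>
  <delimiter> OR </delimiter>
  <default>value &gt; 100</default>
</input>

<search>
  <query>index=my_index | where $range_filter$</query>
</search>
```

Because the delimiter joins multiple selections with OR, checking several boxes shows the union of the selected ranges.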
Hi Team, I need some help. While running the query below, I get the host IP (i.e. 10.65.x.x) in the Number display visualization, but I need to replace it with the name "xyz":

```
index=network ((host=10.65.x.x) AND ((Interface Ethernet1/50 is *) OR (Interface Ethernet1/49 is *) OR (Interface Ethernet1/3 is *)))
| table message_text, host
```

Attached is the screenshot. Can you assist me with what needs to be done to solve this issue?
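A minimal sketch of one way to do this (assuming the host field holds the IP and "xyz" is the desired display name; the hardcoded mapping is for illustration only) is to rewrite the field with eval before displaying it:

```
index=network host=10.65.x.x ...
| eval host=if(host="10.65.x.x", "xyz", host)
| table message_text, host
```

If there are many hosts to rename, a lookup table mapping IPs to names would scale better than chained if() calls.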
Hello, I have a large log and want to extract a specific value. A small part of the log:

"state":{"running":{"startedAt":"2024-12-19T13:58:14Z"}}}],

I would like to extract "running" in this case; the value can be something else in other events. Could you please help me?
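Since the value to extract is the key name directly inside "state", one approach (a sketch; the field name container_state is my own choice, and it assumes the state object has a single key at that position) is a rex that captures whatever key follows "state":{ :

```
... | rex field=_raw "\"state\":\{\"(?<container_state>\w+)\":"
| table container_state
```

This would return running for the sample above, and whichever key appears in other events.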
I cloned the HTTP traffic collection from Splunk Stream and created it under a new name, HTTP_test, but no data is collected. However, data is currently being collected by the existing Stream rules that collect HTTP data. Is there a reason why the cloned item does not collect the same data?
Hi everyone, I'm new to working with Citrix NetScaler and need assistance with integrating it into Splunk Enterprise. Could someone please guide me on:

- The prerequisites required for this integration.
- The exact steps to follow for a successful setup and comprehensive data coverage.

Any detailed insights or documentation links would be greatly appreciated. Also, please let me know when it is necessary to use Splunk dashboards or visualization apps for NetScaler data.

Thank you!
Splunk Add-on for Citrix NetScaler
Hi, I have three license keys for Splunk SOAR and Splunk UBA, each valid for one year. While I am able to install the keys on both SOAR and UBA, I would like to verify all the keys I have installed, identify which key is currently active, and check their expiration dates. Thank you
Hi, I have deployed Splunk Enterprise and my logs are getting ingested into the indexer. I have created an app for enriching the logs with additional fields from a CSV file. I deployed the app by making configuration changes in props.conf and transforms.conf, and I am able to see the search-time enrichment. But my requirement is real-time enrichment, as my CSV file changes every 2 days. Can anyone provide a sample configuration for props.conf and transforms.conf for real-time enrichment of logs with fields from a CSV, based on a match with one of the fields of the logs? Regards
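A sketch of an automatic lookup configuration (every stanza, file, and field name below is a placeholder to adapt to your sourcetype and CSV columns). Note that automatic lookups are evaluated at search time, so replacing the CSV every 2 days is picked up by subsequent searches without reindexing:

```
# transforms.conf -- defines the lookup table
[my_enrichment]
filename = enrichment.csv

# props.conf -- applies it automatically to a sourcetype
[my_sourcetype]
LOOKUP-enrich = my_enrichment key_field AS log_field OUTPUT extra_field1 extra_field2
```

Here key_field is the CSV column matched against log_field in the events, and extra_field1/extra_field2 are the CSV columns added to each matching event.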
Hello everyone, I'm trying to send SPAN traffic from a single interface (ens35) to Splunk Enterprise using the Splunk Stream forwarder in independent mode. The Splunk Stream forwarder and the search head appear to be connected properly, but I'm not seeing any of the SPAN traffic in Splunk. In streamfwd.log, I see the following error:

(CaptureServer.cpp:2032) stream.CaptureServer - NetFlow receiver configuration is not set in streamfwd.conf. NetFlow data will not be captured. Please update streamfwd.conf to include correct NetFlow receiver configuration.

However, I'm not trying to capture NetFlow data; I only want to capture the raw SPAN traffic. Here is my streamfwd.conf:

```
[streamfwd]
httpEventCollectorToken = xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
indexer.1.uri = http://splunk-indexer:8088
indexer.2.uri = http://splunk-indexer2:8088
streamfwdcapture.1.interface = ens35
```

Why is the SPAN traffic not being forwarded to Splunk? How can I configure Splunk Stream properly so that it captures and sends the SPAN traffic to my indexers without any NetFlow setup? Thank you!
I registered for the 14-day free trial of Splunk Cloud Platform. I registered my email address and verified it. I expected to receive an email entitled "Welcome to Splunk Cloud Platform" with corresponding links from which to access and use a trial version of Splunk Cloud. That email never arrived, even after several hours, and there is no sign of it in my "Spam" or "Trash" folders. Please look into this and advise. Thanks! -Rolland
Hello, I have a report scheduled every week, and the results are exported to PDFs. Is there an option to NOT send the email when no results are found? Sometimes these PDFs have nothing in them. Thanks
I have the Splunk Add-on for Google Cloud Platform set up on an IDM server. I am currently on version 4.4 and have inputs set up from months ago. However, I am now trying to send more data to Splunk, and for some reason the inputs page does not load anymore. The connection seems to be fine, as I am still receiving the expected data from my previous inputs, but when I try to add another input I get the following error:

Failed to Load Inputs Page
This is normal on Splunk search heads as they do not require an Input page. Check your installation or return to the configuration page.
Details: AxiosError: Request failed with status code 500
Hi All, I am using the base search and post-process searches outlined below, along with additional post-process searches, in my Splunk dashboard. The index name and fields are consistent across all the panels, and I have explicitly included a fields command to specify the list of fields required for the post-process searches. However, I am observing a discrepancy: the result count in the Splunk search is higher than the result count displayed on the Splunk dashboard. Could you help me understand why this is happening?

Base search:

```
index=myindex TERM(keyword) fieldname1="EXIT"
| bin _time span=1d
| fields _time, httpStatusCde, statusCde, respTime, EId
```

Post-process search 1:

```
| search EId="5eb2aee9"
| stats count as Total, count(eval(httpStatusCde!="200" OR statusCde!="0000")) as failures, exactperc95(respTime) as p95RespTime by _time
| eval "FailureRate"=round((failures/Total)*100,2)
| table _time, Total, FailureRate, p95RespTime
| sort -_time
```

Post-process search 2:

```
| search EId="5eb2aee8"
| stats count as Total, count(eval(httpStatusCde!="200" OR statusCde!="0000")) as failures, exactperc95(respTime) as p95RespTime by _time
| eval "FailureRate"=round((failures/Total)*100,2)
| table _time, Total, FailureRate, p95RespTime
| sort -_time
```
Hi, I have an ask to create an alert that must trigger if there are more than 50 '404' status codes in a 3-minute period, repeated three times in a row, e.g. 9:00-9:03, 9:03-9:06, 9:06-9:09. The count should include only requests with a 404 status code and certain URLs, and the alert must only trigger if there are three values over 50 in consecutive 3-minute windows. I have some initial SPL not using streamstats, but was wondering if streamstats would be better? Initial SPL, run over a 9-minute time range:

```
index="xxxx" "httpMessage.status"=404 url="xxxx/1" OR url="xxxx/2" OR url="xxxx/3"
| timechart span=3m count(httpMessage.status) AS HTTPStatusCount
| where HTTPStatusCount>50
| table _time HTTPStatusCount
```

Thanks.
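streamstats can express the "three consecutive windows" condition directly. A sketch (index and URLs kept as the placeholders from the question): count per 3-minute bucket, then keep a running tally of how many of the last three buckets exceeded 50:

```
index="xxxx" "httpMessage.status"=404 (url="xxxx/1" OR url="xxxx/2" OR url="xxxx/3")
| timechart span=3m count AS HTTPStatusCount
| streamstats window=3 sum(eval(if(HTTPStatusCount>50, 1, 0))) AS windows_over_50
| where windows_over_50=3
```

Run over at least the last 9 minutes, this returns a row only when all three consecutive buckets are over the threshold, which can then drive an alert on "number of results > 0".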
I'm trying to estimate how much my Splunk Cloud setup would cost me, given my ingestion and searches. I'm currently using Splunk with a free license (Docker), and I'd like to get a number that represents either the price or some sort of credits. How can I do that? Thanks.
Below are two queries which return different events but share a common field, thread_id, which can be extracted with the rex below. The raw message logs are different for both queries. I want a list of events with the raw message logs from both queries, but only where the thread_id appears in both. I have tried multiple things like join, append, map, and GitHub Copilot as well, but am not getting the desired results. Can somebody please help with how to achieve this?

```
rex field=_raw "\*{4}(?<thread_id>\d+)\*"

index="*sample-app*" ("*504 Gateway Time-out*" AND "*Error code: 6039*")
index="*sample-app*" "*ExecuteFactoryJob: Caught soap exception*"
```

Here is what I tried:

```
index="*wfd-rpt-app*" ("*504 Gateway Time-out*" AND "*Error code: 6039*")
| rex field=_raw "\*{4}(?<thread_id>\d+)\*"
| append
    [ search index="*wfd-rpt-app*" "*ExecuteFactoryJob: Caught soap exception*"
      | rex field=_raw "\*{4}(?<thread_id>\d+)\*" ]
| stats values(_raw) as raw_messages by _time, thread_id
| table _time, thread_id, raw_messages
```

This returns some correct results containing raw messages from both queries, but some results contain a thread_id and only the 504 Gateway message, even though that thread_id has both types of message when I check separately. I'm new to Splunk; any help is really appreciated.
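One likely cause is grouping by _time as well as thread_id: when the two messages for a thread carry different timestamps, they land in different groups. A sketch of an alternative (same placeholder index and search terms as in the question; msg_type is a name introduced here) that searches both patterns at once, groups by thread_id only, and keeps just the threads that have both message types:

```
index="*wfd-rpt-app*" (("*504 Gateway Time-out*" AND "*Error code: 6039*") OR "*ExecuteFactoryJob: Caught soap exception*")
| rex field=_raw "\*{4}(?<thread_id>\d+)\*"
| eval msg_type=if(searchmatch("504 Gateway Time-out"), "gateway", "soap")
| stats min(_time) AS first_seen, values(_raw) AS raw_messages, dc(msg_type) AS msg_types by thread_id
| where msg_types=2
| table first_seen, thread_id, raw_messages
```

The dc(msg_type)=2 filter is what enforces "only if the thread has both kinds of message".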
Hello everyone, I'm trying to add index filtering for the data models in my setup. I found that for some data models, such as Vulnerabilities, there is no matching data at all. In this case, should I create an empty index for these data models, so that Splunk won't run useless searches for them? Please also let me know if there is a better solution for this case. Thanks & Regards, Iris
Hello, I'm working on creating a Splunk troubleshooting dashboard for our internal team, who are new to Splunk, to troubleshoot forwarder issues, specifically cases where no data is being received. I'd like to know the possible ways to troubleshoot forwarders when data is missing, or for other related issues. Are there any existing dashboards I could use as a reference? Also, what are the key metrics and internal index REST calls that I should focus on to cover all aspects of forwarder troubleshooting? #forwarder #troubleshoot #dashboard
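A starting-point sketch for a "silent forwarder" panel (the 15-minute threshold is an arbitrary example): forwarders write their own logs to the _internal index, so a host that has stopped reporting there has usually stopped sending data entirely:

```
| tstats latest(_time) AS last_seen WHERE index=_internal BY host
| eval minutes_silent=round((now() - last_seen) / 60, 1)
| where minutes_silent > 15
| sort - minutes_silent
```

Each row is a host whose last _internal event is older than the threshold, i.e. a candidate missing forwarder.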
Hi All, I am searching UiPath Orchestrator logs in Splunk as follows:

```
index="<indexname>" source = "user1" OR source = "user2" "<ProcessName>" "Exception occurred"
| rex field=message "(?<dynamic_text>jobId:\s*\w+)"
| search dynamic_text!=null
| stats values(dynamic_text) AS extracted_texts
| map search="index="<indexname>" source = "user1" OR source = "user2" dynamic_text=\"$extracted_texts$\""
```

With my search above, I need to reference the jobId matched in the first search to get the other matching records, in order to process the transaction details. Thanks a lot in advance!
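A subsearch can often replace map here. A sketch (placeholders kept from the question; it assumes the jobId values also appear in the raw text of the other records, so the extracted values can be used as plain search terms via return):

```
index="<indexname>" (source="user1" OR source="user2")
    [ search index="<indexname>" (source="user1" OR source="user2") "<ProcessName>" "Exception occurred"
      | rex field=message "jobId:\s*(?<jobId>\w+)"
      | dedup jobId
      | return 100 $jobId ]
```

return with the $field form emits the bare values OR'd together, so the outer search matches any event containing one of the extracted jobIds, without running one search per value as map does.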
We are using the Salesforce Add-on 4.8.0, and we have the username lookup enabled. It seems to be working properly except for the user type: it is setting all users to Standard. Has anyone seen this, or does anyone know how to fix it?
I'm trying to find a simple way to calculate the product of a single column, e.g.:

value_a
0.44
0.25
0.67

Ideally, I could use something like this:

```
| stats product(value_a)
```

But this doesn't seem possible.
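stats has no product function, but since the product of positive numbers equals the exponential of the sum of their natural logs, a sum can stand in for it. A sketch (assumes every value is positive; zeros or negative values would need special handling):

```
| stats sum(eval(ln(value_a))) AS log_sum
| eval product=exp(log_sum)
```

For the example column, 0.44 * 0.25 * 0.67 = 0.0737.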