All Topics

I was using the search below:

| rex field=_raw "<measResults>\d+\s\d+\s\d+\s\d+\s\d+\s\d+\s\d+\s\d+\s(?<active_state>\d{0,3})\s\d+\s(?<idle_state>\d{0,3})"
| eval date_month=upper(date_month)
| eventstats avg(active_state) as Active_UEs avg(idle_state) as Idle_UEs by date_month
| eval Active_UEs=round(Active_UEs,0), Idle_UEs=round(Idle_UEs,0)
| stats count by date_month,Active_UEs,Idle_UEs
| table date_month,Active_UEs,Idle_UEs

Now I am trying to sort the months in chronological order, so I used this search:

| eventstats avg(active_state) as Active_UEs avg(idle_state) as Idle_UEs by date_month
| eval Active_UEs=round(Active_UEs,0), Idle_UEs=round(Idle_UEs,0)
| eval Month=date_month
| eval orden = if(Month="january",1,if(Month="february",2,if(Month="march",3,if(Month="april",4,if(Month="may",5,if(Month="june",6,if(Month="july",7,if(Month="august",8,if(Month="september",9,if(Month="october",10,if(Month="november",11,12)))))))))))
| sort num(Month)
| stats count by Month,Active_UEs,Idle_UEs
| table Month,Active_UEs,Idle_UEs

But here the months are still sorted alphabetically, not chronologically.
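A hedged sketch of one likely fix, keeping the poster's field names: the search above computes a numeric orden field but then sorts on the text field Month, and the sort runs before stats, which re-orders the rows again by its own by clause. Carrying orden through stats, sorting on it afterwards, and then dropping it should give a chronological result:

| stats count by orden, Month, Active_UEs, Idle_UEs
| sort orden
| fields Month, Active_UEs, Idle_UEs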
Hi everyone, I have a scenario where Splunk is timing out when querying customer SIEM environments and reporting the result as potentially dropped logs. When I check in the customer's SIEM, I see that there are no dropped logs, so I know the issue is with Splunk querying their environment. Knowing that, I am trying to craft an SPL search that looks for searches that were canceled or timed out. Can someone help me with this? What I've tried is below, but it's not accurately correlating to the timestamps I see in the ticket for my customer. Can someone help me build this out?

index=<customer name> cancelled
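A possible starting point, sketched under the assumption that the _audit index is accessible and that the info field values match this Splunk version: audit events record the lifecycle of every search, including cancellations and failures.

index=_audit action=search (info=canceled OR info=failed)
| table _time, user, search_id, info, search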
Hi All, greetings! I need help with a Splunk query. I have two indexes, assets and vulns, and I am trying to build a report that analyzes what percentage of assets are not scanned. With the query below I get the percentage, but some of the IPs have duplicate entries showing as both scanned and not scanned for the same IP. I need the query to drop the SCANNED=0 row when the same IP also has SCANNED=1. I tried dedup, but it removes rows at random, so scanned IPs also get removed. Please help me with the correct query.

((index=index1 sourcetype=asset) OR (index=index1 sourcetype=vulns))
| eval vuln=if('sourcetype'="vulns","yes","no")
| eval assets=if('sourcetype'="asset","yes","no")
| stats max(eval(if(vuln="yes",1,0))) AS SCANNED max(eval(if(assets="yes",1,0))) AS ASSETS latest(ip) as ip by uuid
| search ASSETS=1
| stats count(eval(SCANNED > 0)) AS scanned, count(uuid) as total
| eval percent = round((scanned/(total))*100,2)

Result example:

SCANNED  ASSETS  ip
0        1       192.168.1.1
1        1       192.168.1.1

Thanks!
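A hedged sketch of one fix: the duplicate rows come from different uuid values that share one IP, so collapsing to one row per ip (taking the maximum of each flag) right after the existing stats by uuid removes the SCANNED=0 duplicates before the percentage is computed.

| stats max(SCANNED) as SCANNED max(ASSETS) as ASSETS by ip
| search ASSETS=1
| stats count(eval(SCANNED > 0)) AS scanned, count as total
| eval percent = round((scanned/total)*100,2)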
Hi, can someone advise how to integrate Dell OpenManage Enterprise with Splunk?
Hello Splunkers, for a specific index I configured repFactor = auto, and I assume the logs are exactly the same on my two indexers for this index. How can I verify that all buckets and data have been correctly replicated? When I run a search on that index, should I see the splunk_server field split 50/50 between indexer1 and indexer2? Thanks for the help, GaetanVP
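A hedged sketch of a bucket-level check, assuming a replication factor of 2 (field names can vary by version): dbinspect lists each bucket copy per peer, so counting distinct peers per bucket ID flags anything under-replicated.

| dbinspect index=your_index
| stats dc(splunk_server) as copies by bucketId
| where copies < 2

On the 50/50 question: a normal search reads only the primary copy of each bucket, so the splunk_server split reflects how primaries are assigned across the two indexers rather than proving full replication.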
I have a question: how can I send the SNMP logs of a FortiGate firewall to Splunk? Can anyone help?
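One common pattern, sketched under the assumption that an snmptrapd daemon on a Linux host receives the FortiGate traps and writes them to a log file: a Universal Forwarder then simply monitors that file. The path, index, and sourcetype below are illustrative.

# inputs.conf on the host running snmptrapd
[monitor:///var/log/snmptrapd.log]
index = network
sourcetype = fgt_snmp_trap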
Currently we are ingesting a large volume of AWS VPC Flow Logs into Splunk, and I am wondering whether there is any use for them. Maybe someone can suggest some use cases?
We registered for the Splunk On-Call trial and configured our production instances in Splunk so that if something is wrong, we get a phone call. The trial ended and we are trying to actually pay for the service, but we are getting payment errors. We have sent emails to the billing address, but no one has responded after multiple follow-ups. I tried calling the support number and no one has answered for the past 2 days; I left a voicemail. Below is the error we get when attempting a payment: code: 51000021 message: Invalid parameter(s): 'requestId'.
In a few logs I can see that the escape character is also printed. My rex works fine when I test it on regex101.com, but when I use the same expression in a Splunk search it throws an error. I tried different combinations of quoting, but then a different error appears. Regex: https://regex101.com/r/Nm32kd/2 Splunk error:
I have installed the eventid.net app, but it keeps saying it is not configured yet. What am I missing in the config? I attached screenshots to assist.
Hi, we are getting a lot of events into our Splunk deployment, which is filling up our disk storage rapidly. Our search query is:

index="cloud" sourcetype=sls_sourcetype

Upon investigating, we decided that we need to capture only the events where the field "sql" contains one of the following 4 values:

logout!
login success!
login failed!
create

The field "sql" actually contains more than 100 values, so we need to exclude all of them and capture only the events with the 4 values above. We tried the configurations below in transforms.conf and props.conf, but none of them gives the desired results. Can someone please help us with the correct settings? This is the first time we are working with transforms.conf and props.conf.

transforms.conf:

[setnull]
SOURCE_KEY=_raw
REGEX = *
DEST_KEY = queue
FORMAT = nullQueue

[setqueue]
SOURCE_KEY=_raw
REGEX = sql=log(out! |in success! |in failure!)
DEST_KEY = queue
FORMAT = indexQueue

props.conf:

[sourcetype::sls_sourcetype]
TRANSFORMS-set= setnull,setparsing
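A hedged sketch of the documented "keep only specific events" pattern, assuming the raw events literally contain strings such as sql=logout!: the null-queue transform must match every event (REGEX = ., not *, which is an invalid regex on its own), the keep transform must list the exact strings to retain, and a props stanza for a sourcetype is written without the sourcetype:: prefix. The names in TRANSFORMS-set must also match the stanza names in transforms.conf (the original references setparsing but defines setqueue).

# transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX = sql=(logout!|login success!|login failed!|create)
DEST_KEY = queue
FORMAT = indexQueue

# props.conf
[sls_sourcetype]
TRANSFORMS-set = setnull, setparsing

Order matters: setnull routes everything to the null queue first, then setparsing routes the matching events back to the index queue. These settings belong on the indexers or the first heavy forwarder in the ingestion path.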
Hi All, I need to collect application thread dumps and heap dumps into Splunk. What are the possible ways to achieve this?
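One hedged note and sketch: thread dumps (e.g., jstack output) are plain text and can simply be written to a directory that a forwarder monitors, whereas heap dumps (.hprof) are binary and generally not suitable for indexing, so capturing metadata about them (file name, size, timestamp) tends to be more practical. The paths and names below are illustrative.

# inputs.conf on the application host
[monitor:///opt/app/dumps/thread-dumps/*.txt]
index = app_diag
sourcetype = jvm_thread_dump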
Splunk AppInspect reports the "check_for_supported_tls" failure with this description: "If you are using requests.post to talk to your own infra with non-public PKI, make sure you bundle your own CA certs as part of your app and pass the path into requests.post as an arg." I am using verify=False in the requests.post() call and am getting the above error from the AppInspect tool.
I have a Splunk UF 7.0.3 that I want to send logs from to Splunk Cloud. However, this UF version doesn't support httpout, so I am using an intermediate forwarder. Can someone give me the inputs and outputs files for the intermediate forwarder, and the outputs file for the UF that sends to the intermediate forwarder?
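A minimal sketch, with an illustrative host name and the default S2S port: the old UF points its tcpout at the intermediate forwarder, and the intermediate forwarder listens on splunktcp. The Cloud-side outputs on the intermediate forwarder come from installing the universal forwarder credentials app downloaded from Splunk Cloud, so no hand-written outputs stanza for the Cloud side is shown here.

# outputs.conf on the UF 7.0.3 host
[tcpout]
defaultGroup = intermediate

[tcpout:intermediate]
server = intermediate-fwd.example.com:9997

# inputs.conf on the intermediate forwarder
[splunktcp://9997]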
Good day experts. To manage ingestion volume, I need to apply truncation to a source that sends a pretty high volume of data. However, we do not wish to truncate all events from this source, only certain less critical events. I tried to override the sourcetype in transforms.conf with a regex that matches the less critical events and apply TRUNCATE in props.conf to the custom sourcetype, but this failed. Only then did I recall that TRUNCATE is applied earlier in the ingestion pipeline than the sourcetype-overriding transform: the transform changed the sourcetype as expected, but the truncation never ran against the renamed sourcetype. Has anyone faced a similar use case, or can you share a way to manage this? Thank you in advance.
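A hedged sketch of one alternative, borrowing the documented _raw-rewriting technique normally used for data masking: instead of relying on TRUNCATE, a transform can shorten _raw itself for events matching a less-critical marker, since transforms may write back to _raw. The marker string, length, and stanza names below are purely illustrative.

# transforms.conf
[truncate_less_critical]
SOURCE_KEY = _raw
# keep only the first 1024 chars of events containing the marker (illustrative)
REGEX = ^((?=[^\r\n]*LESS_CRITICAL_MARKER).{0,1024})
DEST_KEY = _raw
FORMAT = $1

# props.conf
[your_sourcetype]
TRANSFORMS-trunc = truncate_less_critical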
Searches Delayed Root Cause(s): The percentage of non-high-priority searches delayed (22%) over the last 24 hours is very high and exceeded the red threshold (20%) on this Splunk instance. Total searches that were part of this percentage = 147626. Total delayed searches = 32735. How can we permanently solve this issue?
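A hedged triage sketch, assuming access to the _internal index: the scheduler logs show which scheduled searches dominate the workload and how often they complete, skip, or continue, which usually points at over-scheduled or overlapping searches that can be re-spaced or tuned.

index=_internal sourcetype=scheduler
| stats count by savedsearch_name, status
| sort - count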
Hi all, I am trying to find a way to use the REST API (for example, the search endpoint) with Splunk, but my problem is that my company uses an Okta application to log in to Splunk. I don't have a Splunk username or password, so is there any way to work around this authentication?
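One hedged option rather than a bypass: Splunk authentication tokens (Settings > Tokens, available since Splunk 7.3) work independently of the SSO login flow and are passed as a Bearer header. A minimal sketch against the management port, with an illustrative host name and token variable:

curl -k -H "Authorization: Bearer $SPLUNK_TOKEN" \
     https://splunk.example.com:8089/services/search/jobs \
     -d search="search index=_internal | head 5" -d output_mode=json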
I have a JSON file I am searching for a specific value, EventType=GoodMail, and I then want to pull the values from another field, {}.MessageCount. I have the following search to pull back only the GoodMail EventType:

index="mail_reports"
| spath
| mvexpand "{}.EventType"
| search {}.EventType=GoodMail

But if I add this to the end of the search:

| stats values "{}.MessageCount"

I get: "Error in 'stats' command: The argument '{}.MessageCount' is invalid." How do I modify the search to pull back the values for {}.MessageCount? Thx
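A likely fix, sketched: stats aggregations need function-call syntax, with field names containing special characters quoted inside the parentheses.

| stats values("{}.MessageCount") as MessageCount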
Splunk receives 2 different stream data sets on a single HEC (JSON). Set 1 has call records; set 2 has call status/disposition. So if I want call detail information from set 1 for calls that meet criteria in set 2, I have to join the records. I used to use 'join', but after reading several articles about other ways I came across this method, which I like, although it feels really slow/heavy:

index="myindex" resource="somefilter"
| stats values(*) as * by guid
| search column="terminated"

because we have millions of rows to search and I'm just looking for a few. I tried adding my search criteria higher up, like this:

index="myindex" resource="somefilter" column="terminated"
| stats values(*) as * by guid

but then the other columns come back empty (I presume because the filter removed the set 1 events, so there was nothing to join). So I am looking for another/faster/better way to: 1. get data from set 2 with criteria X, and 2. bring back the matching records from set 1. Always many thanks for the education!
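One hedged alternative: use a subsearch to collect just the guids that match in set 2, then pull both sets only for those guids before the stats join. Subsearch output is capped (by default around 10,000 results), which fits the "just looking for a few" case.

index="myindex" resource="somefilter"
    [ search index="myindex" resource="somefilter" column="terminated"
      | fields guid ]
| stats values(*) as * by guid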
(Novice) Is there a way to uniquely identify the information being sent to a single indexer from multiple forwarders in separate environments? Each environment is a mirror of the others: they all have the same IPs and hostnames, including the forwarders. Maybe there is a tag the forwarder can apply, or something else that makes them unique?
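One hedged approach: each forwarder can stamp its events with an indexed field via _meta in inputs.conf, so identical hosts remain distinguishable at search time. The field name and value below are illustrative.

# inputs.conf on the forwarder in environment A
[default]
_meta = environment::env_a

At search time the field is queryable as environment::env_a (or environment=env_a), letting otherwise identical events be separated by originating environment.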