All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi, how can I extract the open episodes, with the ServiceNow incident against each episode, in Splunk ITSI? Thanks!
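A heavily hedged starting point, assuming ITSI's grouped-alerts index and assuming episode-to-ticket mappings live in a ticketing lookup (the lookup name and output fields below are assumptions to verify against your ITSI version, not confirmed names):

index=itsi_grouped_alerts
| stats latest(_time) as last_event, values(source) as contributing_sources by itsi_group_id
| lookup itsi_notable_event_group_ticketing itsi_group_id OUTPUT ticket_id as servicenow_incident
| where isnotnull(servicenow_incident)

The episode status filter (to keep only open episodes) and the exact ticketing lookup/collection name are the parts most likely to need adjusting; the Episode Review REST endpoints are another route if the KV store names differ in your environment.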
We finally migrated away from the Microsoft Azure Add-on for Splunk to the Splunk Add-on for Microsoft Cloud Services. In the Microsoft Azure Add-on for Splunk, inputs.conf made it possible to specify the Event Hub sourcetype manually, but in the Splunk Add-on for Microsoft Cloud Services we have to choose from the provided values. The problem is that we need the values azure:ad_signin:eventhub and azure:ad_audit:eventhub, but the Splunk Add-on for Microsoft Cloud Services provides only mscs:azure:eventhub. Based on the log information from Azure, there is a Category field with the values (SignInLogs, AuditLogs). From it I can tell which is the audit log and which is the sign-in log, and change the sourcetype for each log type. On the heavy forwarder where the app is deployed (/opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/default) I added the following config, but nothing changed; the sourcetype stays mscs:azure:eventhub. Any ideas what I'm missing?

props.conf
[mscs:azure:eventhub]
TRANSFORMS-rename = SignInLogs,AuditLogs

transforms.conf
[SignInLogs]
REGEX = SignInLogs
SOURCE_KEY = field:category
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::azure:ad_signin:eventhub
WRITE_META = true

[AuditLogs]
REGEX = AuditLogs
SOURCE_KEY = field:category
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::azure:ad_audit:eventhub
WRITE_META = true
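A possible variation to try, assuming the category value is only present in the raw JSON at parse time (and therefore not usable via SOURCE_KEY = field:category), is to match the category string in _raw and to put the override in local/ rather than default/. This is only a sketch of that approach, not a confirmed fix:

# /opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/local/props.conf
# (transform names below are arbitrary; the regex assumes JSON events with a "category" key)
[mscs:azure:eventhub]
TRANSFORMS-azure_ad_sourcetypes = set_ad_signin_sourcetype, set_ad_audit_sourcetype

# /opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/local/transforms.conf
[set_ad_signin_sourcetype]
# SOURCE_KEY defaults to _raw, so the match is done against the raw event text
REGEX = "category"\s*:\s*"SignInLogs"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::azure:ad_signin:eventhub

[set_ad_audit_sourcetype]
REGEX = "category"\s*:\s*"AuditLogs"
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::azure:ad_audit:eventhub

After restarting the heavy forwarder, only newly ingested events would pick up the new sourcetypes; anything already indexed keeps mscs:azure:eventhub.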
Hi All, Is there a way in which Splunk can generate an alert when backup and restoration exercises are conducted? Any use case that can do this? Any assistance on this would be appreciated.
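One common pattern, assuming the backup/restore software already writes its activity logs into Splunk (the index, sourcetype, and field names below are purely hypothetical placeholders for whatever your backup tool produces), is a scheduled alert along these lines:

index=backup_logs sourcetype=backup:activity (action=backup OR action=restore)
| stats count latest(_time) as last_seen by host, action, status
| convert ctime(last_seen)

Saved as an alert that triggers when results are returned, this would fire whenever a backup or restoration job is logged during the schedule window.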
Hi, is there a way to use CSS to fix the font size of text in the Status Indicator?
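One approach, assuming a Simple XML dashboard and the Status Indicator custom visualization, is an inline style block in a hidden HTML panel; the panel id and CSS selector below are assumptions that may need adjusting to whatever your browser inspector shows for the rendered viz:

<panel depends="$alwaysHideCSS$">
  <html>
    <style>
      #status_panel svg text {
        font-size: 24px !important;
      }
    </style>
  </html>
</panel>

where status_panel is the id attribute set on the panel that contains the Status Indicator.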
Hi, we are using the VMware Carbon Black Cloud app, and the VMware logs are pulled from AWS S3 buckets. The index is receiving logs. However, the app's dashboards are not working when configured with the same index. Please help remediate. Thanks
Hi, I have a scheduled alert that sends out an email every 7 days. The sysadmin turned off the server for whatever reason and forgot to turn it back on. Obviously, the report didn't trigger. Is it possible to get/generate the report that was supposed to come in? I'm at a loss here, having just found out today. Thanks
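If the underlying events are still in the index, one option is simply to re-run the saved search over the window the missed run would have covered. A minimal sketch, assuming the report is a saved search named "Weekly Status Report" (hypothetical name):

| savedsearch "Weekly Status Report"

Run this with the time range picker set to the 7-day window the missed schedule would have covered, then export or send the results manually. Depending on whether the saved search hard-codes its own time range, you may instead need to copy its SPL into a new search and set the time range there.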
Greetings, I've been asked to provide log data for a specific form that has been accessed over a certain time period. As the data are going to leave our organization, I want to filter it down to only the relevant data. I'm looking for events in which a certain HTML form has been accessed, and I want to display events shortly before and after that show the same user agent. I've attempted a few things, but in the latest query I tried to use the map command. I'm not sure why I receive the error "Error in 'map': Did not find value for required attribute 'useragent'."

index=iis_logs
    [search "https://example.com/form.html"
    | eval start=_time-15
    | eval stop=_time+30
    | eval useragent=_cs_User_Agent
    | map search="search index=* cs_User_Agent=$useragent$ earliest=$start$ latest=$stop$"]

Edit: When I, for example, put cs_User_Agent=*Mozilla*, there are results surrounding the relevant events. But that is not the data I am looking for.
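The error usually means at least one row reaching map has no useragent value, and the structure above also runs map inside the subsearch rather than over its results. A possible restructuring, treating map as the outer command and dropping rows without a user agent first (the field name cs_User_Agent is taken from the question; adjust it if your extraction differs), would be:

index=iis_logs "https://example.com/form.html"
| eval start=_time-15, stop=_time+30, useragent=cs_User_Agent
| where isnotnull(useragent)
| fields useragent, start, stop
| map maxsearches=100 search="search index=iis_logs cs_User_Agent=\"$useragent$\" earliest=$start$ latest=$stop$"

This runs one follow-up search per matching form hit and returns the surrounding events that share the same user agent.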
Hello all, hoping someone may be able to help. I have an export from an internal tool in the form of a CSV that has a column named ip. I uploaded this as a lookup (name.csv). I verified I can see the IP information with | inputlookup name.csv, and the rows of IP addresses show. I have a base search that returns data, and I want to see if any of the src or dest IPs from my search match the IP addresses listed in my name.csv. I keep running into a search that returns a few thousand events, although I can search the events for src and dest and they show without the lookup. Currently my search looks like this:

(index=name1 OR index=name2 OR index=name3) src_ip_country=United States action=allowed
| stats count by src, dest
| sort count
| reverse
| lookup name.csv ip OUTPUT target_ip
| table target_ip, src, dest

This search provides me a tabled output with the src and dest fields populated, but nothing in the "target_ip" field. Any ideas? Thank you.
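One likely issue is that the lookup call asks for a target_ip field that does not exist in a CSV whose only column is ip, and no event field is mapped onto the lookup's ip column. A possible rewrite, assuming the CSV really has just the one ip column, matches src and dest against it separately and outputs the matched value under new names (note the quotes around the multi-word country value):

(index=name1 OR index=name2 OR index=name3) src_ip_country="United States" action=allowed
| stats count by src, dest
| lookup name.csv ip AS src OUTPUT ip AS src_match
| lookup name.csv ip AS dest OUTPUT ip AS dest_match
| where isnotnull(src_match) OR isnotnull(dest_match)
| table src, dest, src_match, dest_match, count

Rows where src_match or dest_match is populated are the ones whose addresses appear in the CSV.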
Hi, REX command rex mode=sed to remove quotation marks and the numbers inside of them:

OUTPUT file "19214132.IKU" copied to output directory
OUTPUT file "19315133.IKU" copied to output directory
OUTPUT file "19416134.IKU" copied to output directory
....

Desired result ->

OUTPUT file .IKU copied to output directory
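A minimal sketch of one way to do this with a sed-mode replacement, assuming the digits always sit directly inside the quotes and the extension should be kept:

... | rex mode=sed field=_raw "s/\"\d+(\.\w+)\"/\1/g"

This strips the surrounding quotes and the numeric part while keeping the extension, turning "19214132.IKU" into .IKU. If the whole quoted token should be removed instead, drop the capture group from the replacement.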
I am working to leverage the below query for 'Stale Account Usage' from the Splunk Security Essentials docs, which uses a lookup called "account_status_tracker". The 'How to Implement' guidance includes: "The only step you'll need to take is to create a lookup called account_status_tracker, and have authentication data in Common Information Model format." From the "Add New" lookup webpage, it is not clear how I assign an appropriate "Lookup File" that will have the necessary fields in CIM format. I have looked through Splunk docs and other likely resources with no strong hits. I admit this is an area new to me. My question is: what steps do I need to take to define this lookup, including assigning an appropriate "Lookup File"? When I select existing authentication-related files as the "Lookup File", I receive error messages, for example: "Cannot find the destination field 'count' in the lookup table..." Any leads greatly appreciated.

index=* source="*WinEventLog:Security" action=success
| stats count min(_time) as earliest max(_time) as latest by user
| multireport
    [| stats values(*) as * by user
     | lookup account_status_tracker user OUTPUT count as prior_count earliest as prior_earliest latest as prior_latest
     | where prior_latest < relative_time(now(), "-90d")
     | eval explanation="The last login from this user was " . (round( (earliest-prior_latest) / 3600/24, 2) ) . " days ago."
     | convert ctime(earliest) ctime(latest) ctime(prior_earliest) ctime(prior_latest) ]
    [| inputlookup append=t account_status_tracker
     | stats min(earliest) as earliest max(latest) as latest sum(count) as count by user
     | outputlookup account_status_tracker
     | where this_only_exists_to_update_the_lookup='so we will make sure there are no results']
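One way to bootstrap the lookup, assuming a CSV-backed lookup is acceptable, is to create the file with the exact fields the query expects (user, earliest, latest, count) by running the base search once and writing its output, then adding a lookup definition named account_status_tracker that points at that file:

index=* source="*WinEventLog:Security" action=success
| stats count min(_time) as earliest max(_time) as latest by user
| outputlookup account_status_tracker.csv

After that, in Settings > Lookups > Lookup definitions, create a definition called account_status_tracker referring to account_status_tracker.csv; subsequent runs of the detection will then read from and update it. This is a sketch of one approach, not the official SSE setup procedure.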
Dear Splunk community: I have the following SPL that has been running fine for the last week or so; however, all of a sudden I am getting a last, unwanted column (Value) which I don't expect to get. Can you please explain what I need to modify so that I don't get the last Value column?

<my search>
| chart count by path_template, http_status_code
| addtotals fieldname=total
| foreach 2* 3* 4* 5*
    [ eval "percent_<<FIELD>>"=round(100*'<<FIELD>>'/total,2),
      "<<FIELD>>"=if('<<FIELD>>'=0 , '<<FIELD>>', '<<FIELD>>'." (".'percent_<<FIELD>>'."%)")]
| fields - percent_* total

Here is what I see: Really appreciate your help on this! Thanks!
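If the column is literally named Value, the quickest workaround is to drop it explicitly at the end of the pipeline with | fields - Value. If it is instead chart's catch-all column for events with a missing http_status_code or for values beyond the default column limit, a hedged alternative is to suppress those columns at the chart step (this is a guess; the real cause can't be confirmed without seeing the data):

<my search>
| chart usenull=f useother=f limit=0 count by path_template, http_status_code
| addtotals fieldname=total
| foreach 2* 3* 4* 5*
    [ eval "percent_<<FIELD>>"=round(100*'<<FIELD>>'/total,2),
      "<<FIELD>>"=if('<<FIELD>>'=0 , '<<FIELD>>', '<<FIELD>>'." (".'percent_<<FIELD>>'."%)")]
| fields - percent_* total

usenull=f drops the column for events with no http_status_code, and useother=f / limit=0 prevent extra values from being lumped into a single catch-all column.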
Hello,    I need to create a single value panel that displays a countdown from today's date until a target date, how can I achieve this?  Right now I am using sample data and I added a field called "GoLiveDate" to the data and put the target date in (which is in the future.) What I want to do is put a panel in that says something like "80 days until Go Live."  Then tomorrow it would say "79 days until Go Live" etc etc -  Is something like this possible?
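This is doable with an eval in the search behind the single value panel. A minimal sketch, assuming GoLiveDate is stored as a string like 2023-01-15 (adjust the strptime format to match your actual field):

... | eval days_left = ceil((strptime(GoLiveDate, "%Y-%m-%d") - now()) / 86400)
    | eval countdown = days_left . " days until Go Live"
    | table countdown

Point the single value visualization at the countdown field; since now() is evaluated at search time, the number decreases each day the panel refreshes.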
I have 2 dates:
first_found: 2022-08-23T21:08:54.808Z
last_fixed: 2022-08-30T12:56:58.860Z
I am trying to calculate the difference in days between them (first_found - last_fixed) and dump the result in a new field called "remediation_days".
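A minimal sketch using strptime to parse both ISO-8601 timestamps (the format string assumes millisecond precision and a literal trailing Z, as in the examples above):

... | eval ff_epoch = strptime(first_found, "%Y-%m-%dT%H:%M:%S.%3NZ"),
           lf_epoch = strptime(last_fixed, "%Y-%m-%dT%H:%M:%S.%3NZ")
    | eval remediation_days = round((lf_epoch - ff_epoch) / 86400, 2)

With the sample values this yields roughly 6.66 days; drop the round() or flip the sign if a different convention is needed.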
I have some searches that do not appear to be enhancing properly using the asset_lookup_by_str lookup table. In this case I'll use dvc, but it appears to be similar with the others, like dest, as well. I run a search and the enhancements don't seem to be happening; if I run a search using the lookup, it works sometimes. I'll give a few examples.

Search 1: index=windows | lookup asset_lookup_by_str asset AS dvc OUTPUTNEW domain AS AAAAA
Result 1: AAAAA contains only 1 of our 5 domains.

Search 2: index=windows host=[hostname] | lookup asset_lookup_by_str asset AS dvc OUTPUTNEW domain AS AAAAA
Result 2: AAAAA doesn't show up.

Search 3: | inputlookup asset_lookup_by_str
Result 3: The lookup table appears to be filled in nicely, including the domains missing from Search 1.

Search 4: | inputlookup asset_lookup_by_str | search asset=[host]
Result 4: The host searched in Search 2 is in the lookup table.

Based on this, I'd say not only should Search 2 have worked, but Search 1 should have had more results, and the automatic lookup as a whole should be working. Any ideas what could be happening? My only thought is maybe some sort of priority order that is overriding the enhancement feature. But that wouldn't explain it not working in the specific search.
I have used the below query, and it is saving its output to a lookup.

Lookup name - S1_installedtime
Query -
index=sentinelone
| table installedAt agentComputerName agentDomain
| search installedAt!="Null"
| dedup agentComputerName

installedAt - this field gives the installation time.

Now I want a query that compares with the lookup table (S1_installedtime) and returns a result if there is any new agentComputerName in the last 1 week.

Objective - Need a list of agentComputerName having SentinelOne installed in the last 7 days.
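A possible approach, assuming installedAt can be parsed with strptime and that the lookup holds the previously seen agents (the time format below is an assumption; adjust it to the actual installedAt format):

index=sentinelone installedAt!="Null"
| dedup agentComputerName
| eval installed_epoch = strptime(installedAt, "%Y-%m-%dT%H:%M:%S")
| where installed_epoch >= relative_time(now(), "-7d@d")
| search NOT [| inputlookup S1_installedtime | fields agentComputerName ]
| table agentComputerName agentDomain installedAt

The NOT [...] subsearch drops agents already present in S1_installedtime, leaving only computers whose SentinelOne install time falls within the last 7 days and that are new to the lookup.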
Hello, I have a few use cases to send data from Splunk to consumers in real time, and the consumers run both Linux and Windows. Does Splunk have any options to do that? Or how would we do it? Any help will be highly appreciated. Thank you so much.
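Common options include forwarding a copy of the data to a third-party system over syslog via outputs.conf, pushing search results with scheduled alert actions (for example a webhook), or letting consumers pull results via the REST API. A minimal outputs.conf sketch for the syslog route, with a placeholder destination host, might look like this on the forwarding tier:

# outputs.conf on the forwarder (destination host/port are placeholders)
[syslog]
defaultGroup = external_consumers

[syslog:external_consumers]
server = consumer.example.com:514
type = tcp

Whether this fits depends on the use case; alert-driven delivery or REST pulls are usually simpler when only search results, rather than the raw feed, need to reach the consumers.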
Dear All, I have about 100 Splunk UFs at 7.0.1, 7.3.5, 8.1.5, 8.2.5 and 9.0.0.1 and they are NOT being managed by a Deployment Server. I need to get them all managed by a DS at v 9.0.1, so that I can manage my apps remotely and so that I can get around the latest DS security CVEs. What is the oldest Splunk UF that a DS 9.0.1 can manage? The latest version of the Forwarder compatibility document is not available (and it does not cover compatibility between DS and UFs, anyway). Lastly, if I were to deploy a 8.2 DS, then would I be able to control the 9.0.0.1 UF?
Hi, I'm trying to display only a value in one particular column, for instance representing one team across different statuses. This is what I've done so far:

index=xxx | stats count by Team, status

My expected result is to have only one "DevOps" representing the team across the different statuses displayed:

Team        | Status    | Count
DevOps      | Assigned  | 10
            | Pending   | 5
            | New       | 2
            | Resolved  | 1
            | ..........

Many thanks for any help
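One way to blank out the repeated Team value after the stats, sketched under the assumption that the table is sorted by Team (streamstats compares each row with the previous one):

index=xxx
| stats count by Team, status
| sort 0 Team, status
| streamstats current=f last(Team) as prev_team
| eval Team = if(Team == prev_team, "", Team)
| fields - prev_team

Each row after the first for a given team then shows an empty Team cell. An alternative is | stats list(status) as Status, list(count) as Count by Team, which keeps one row per team with multivalue columns.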
Dear All/Splunk Documentation Team, I am trying to read the Compatibility between forwarders and Splunk Enterprise indexers document in the Forwarder Manual for 9.0.1 forwarders, but all I get is the generic "The topic you've asked to see does not apply to the most recent version" when it patently DOES apply: https://docs.splunk.com/Special:SpecialLatestDoc?t=Documentation/Forwarder/latest/Forwarder/Compatibilitybetweenforwardersandindexers but the latest version available is 8.2.5: https://docs.splunk.com/Documentation/Forwarder/8.2.5/Forwarder/Compatibilitybetweenforwardersandindexers Can this document be copied/updated for 8.2.6/7/8 and 9.x, please?
Hi All, If I apply a limits.conf with subsearch maxout and searchresults maxresultrows for an app I'm deploying, will this update to limits overwrite the default for all apps, or will this configuration only be applicable to the app I deploy it in? Basically, are all limits going to be changed, or just that one application?
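For reference, the settings in question would typically be packaged like this (the app name and values are placeholders, not recommendations); note that limits.conf settings are generally merged into the system-wide configuration by the normal precedence rules rather than being scoped to the app that ships them, so it is worth confirming the effective values with btool after deployment:

# etc/apps/my_app/local/limits.conf  (app name is hypothetical)
[subsearch]
maxout = 10000

[searchresults]
maxresultrows = 50000

# check the merged result on the search head:
# $SPLUNK_HOME/bin/splunk btool limits list subsearch --debug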