All Topics


After the update to v7.1 of Splunk ES, in the Incident Review channel, when selecting events and choosing Edit Selected, a popup/overlay window appears where we can change the Status (Analyzing, Closed, etc.) and assign ourselves as the Owner. When clicking Save Changes, the overlay window does not auto-close, and we have to manually click the Close button. In the previous version this overlay auto-closed and the Incident Review page refreshed after clicking Save Changes (or Save). Is there a configuration setting that will re-enable this auto-close behavior after making Status changes?
Hi, I'm wondering whether the syslog output described in the [syslog] stanza of outputs.conf supports TLS encryption. I see no mention of it in the docs.
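For context, a minimal [syslog] stanza only exposes plaintext transport settings; there is no ssl/tls option among them, which suggests the built-in syslog output sends unencrypted data. A sketch (group name and server are placeholders):

```
# outputs.conf -- example syslog output stanza.
# Available settings cover server, transport type, priority, etc.;
# note there is no TLS/SSL setting for this output.
[syslog:my_syslog_group]
server = syslog.example.com:514
type = tcp
```

If encryption is required, a common workaround is to send to a local syslog daemon (e.g. rsyslog or syslog-ng) that handles TLS onward, or to use a [tcpout] group with SSL to another Splunk instance.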
I have three fields like:

field1=SGSIFASFFWR035A
field2=AXAZCBDM02
field3=ESESDFAADFSABBM00002

From the examples above I want to extract the field values like this:

field1=FWR035A (any character after FW*, including FW)
field2=BDM02 (any character after BDM*, including BDM)
field3=BBM00002 (any character after BBM*, including BBM)

Additionally, I want to use a single command to extract all three field values in one go, like "FW*|BDM*|BBM*". I am using the rex command below, but it does not include the FW keyword in the extracted field:

| rex field=field1 "FW(?<AFTERTHISKEYWORD>\S+)"

If you can provide a workable solution, either using rex and eval or other code, it would be appreciated. Thanks in advance.
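Not an official answer, but one sketch: to keep the keyword in the result, put it inside the capturing group, and use an alternation to handle all three prefixes. The second variant applies the same pattern to all three fields at once with foreach (field names taken from the question):

```
| rex field=field1 "(?<AFTERTHISKEYWORD>(?:FW|BDM|BBM)\S+)"

| foreach field1 field2 field3
    [ eval <<FIELD>> = replace('<<FIELD>>', ".*((?:FW|BDM|BBM)\S+)", "\1") ]
```

In the foreach variant, replace() rewrites each field to just the captured portion (keyword plus everything after it) and leaves the field unchanged if no keyword matches.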
Hi, after configuring some reports in PCI, when I go back to Report, I get an error message: "A custom JavaScript error caused an issue loading your dashboard. See the developer console for more details." How can I fix this error?
Hello Splunkers, please help me. I need a search to generate a daily report of users' traffic in internal logs. I have a csv file, generated daily by an external system, that contains a username and a start-end time period, like this:

report.csv
user,start_time,end_time
user1,8,16
user2,8,20

I have to feed these three fields per user into my search. I am using inputlookup to pick up the "user" field this way:

[base search] | search user=*[|inputlookup "report.csv" |fields user ]* | table x,y,z,user

It works and shows only the user-related logs; there can be one or more users in the csv. The problem I cannot handle yet is the additional fields. I had an idea to add an extra field with the "eval" command, but it doesn't work. So how can I read the rest of the data from the external csv file? Thanks
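A sketch of one approach: instead of a subsearch, the lookup command can pull all three columns per user and then keep only matching events. This assumes report.csv is uploaded as a lookup table file (or lookup definition) on the search head:

```
[base search]
| lookup report.csv user OUTPUT start_time end_time
| where isnotnull(start_time)
| table x, y, z, user, start_time, end_time
```

The where clause replaces the subsearch filter: events whose user has no row in the csv get null start_time and are dropped, and the two extra columns come along for free.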
Hello dear Splunk community, how can I set the colors of the bar chart in Splunk Dashboard Studio? Example code:

index=digiks sourcetype=modeas_nexonic earliest="@d+390m-7d" latest="@d+1830m" SUB_DIB="4711" OR SUB_DIB="0815"
| dedup RECV_TIME
| chart count(RECV_TIME) over shift day by SUB_DIB
| rename "4711" AS A_CHART, "0815" AS B_CHART

The bar for 4711 should be displayed in red, and the bar for 0815 in green. Thanks in advance!
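In Dashboard Studio, series colors can usually be pinned in the chart's source JSON via the seriesColorsByField option. A sketch, assuming the renamed series names from the query above and standard hex colors:

```
"options": {
    "seriesColorsByField": {
        "A_CHART": "#FF0000",
        "B_CHART": "#00FF00"
    }
}
```

This goes in the visualization's "options" object in the dashboard definition (accessible via the source editor), so each named series keeps its color regardless of ordering.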
What kinds of KPIs are supported when the KPI Analyzer is enabled?
We have different licenses that expire on different dates. The current sourcetype data includes both future and past expiry dates. We don't want licenses that have already expired, meaning anything dated before the current day. The "Expire valid To" field holds this date. Please let me know how to achieve this in SPL.
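One way to sketch this, assuming "Expire valid To" holds a date string (adjust the strptime format to match your actual data):

```
... base search ...
| eval expiry_epoch = strptime('Expire valid To', "%Y-%m-%d")
| where expiry_epoch >= relative_time(now(), "@d")
```

strptime converts the date string to epoch time, and relative_time(now(), "@d") snaps to midnight today, so only licenses expiring today or later survive the where clause.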
Hi Splunk Community -- I'm trying to ensure that my cluster master is sending its internal logs to the indexers. In which directory on my cluster master should I put outputs.conf? And are there other conf files that should accompany my outputs.conf file?
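For reference, a minimal forwarding setup typically lives in $SPLUNK_HOME/etc/system/local/ on the cluster master (hostnames and ports below are placeholders):

```
# $SPLUNK_HOME/etc/system/local/outputs.conf
[tcpout]
defaultGroup = my_indexers

[tcpout:my_indexers]
server = indexer1.example.com:9997, indexer2.example.com:9997
```

A restart is needed after the change. No other conf file is strictly required just to forward the internal logs, since the _internal inputs already exist by default.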
I installed the Splunk add-on to push events into ServiceNow and am getting this error: "snsecingestes Unable to forward notable event". Where could I start troubleshooting this issue?
Hello Splunkers, I have two lookups that need to be joined. lookup1.csv contains the rule name and the technique id in its columns, and lookup2.csv contains the technique id and the tactic name in its columns. I have joined the two lookups and got a result. The problem I am facing is that if a rule has multiple tactic names, the output displays them all in a single field (screenshot attached). Instead, if a rule has two tactic names, it should display the rule name twice, once per tactic name. The query I used to join the lookups is:

| inputlookup lookup1.csv | lookup lookup2.csv technique_id

Waiting for a response. Thanks in advance
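A sketch, assuming the tactic column in lookup2.csv is named tactic_name: mvexpand splits a multivalue field into one row per value, which gives one row per rule/tactic pair:

```
| inputlookup lookup1.csv
| lookup lookup2.csv technique_id OUTPUT tactic_name
| mvexpand tactic_name
```

The lookup command returns multiple tactic names as a multivalue field, and mvexpand then duplicates the rule name row once for each tactic.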
Hi Splunk Community, I need a Splunk query that monitors for a password change in the DC log source (X) that was not performed by the user, but rather by an automation of the system itself, log source Y (Automated Password System). I want to trigger the search only when a password change event is received, and from that point search 10 minutes back to see whether the trigger was the system itself or an actual user. Can you please assist?
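One possible sketch of the "look back 10 minutes" pattern. All index, sourcetype, EventCode, and field names here are placeholders to adapt to your environment; the idea is to merge both sources, order events per user, and flag password changes not preceded by an automation event within 600 seconds:

```
(index=dc_logs EventCode=4724) OR (index=automation_logs action="password_reset")
| eval src = if(index="dc_logs", "change", "automation")
| sort 0 user _time
| streamstats window=1 current=f last(_time) as prev_time last(src) as prev_src by user
| where src="change" AND (isnull(prev_src) OR prev_src!="automation" OR _time - prev_time > 600)
```

streamstats carries the previous event's source and timestamp per user, so the where clause keeps only password changes with no recent automation activity, which is what the alert should fire on.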
Hello, is it possible to calculate the storage that part of a log takes? I have a log file containing a message whose storage I want to calculate. After getting the numbers, is it possible to exclude that part from indexing? Thanks
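For the measuring part, a rough sketch, assuming the portion of interest is already extracted into a field called message (a placeholder name). len() returns the character count, which approximates bytes for ASCII data:

```
index=my_index sourcetype=my_sourcetype
| eval msg_bytes = len(message)
| stats sum(msg_bytes) as total_bytes
| eval total_mb = round(total_bytes / 1024 / 1024, 2)
```

For the excluding part, a SEDCMD setting in props.conf (e.g. SEDCMD-strip = s/<pattern>//g) is the usual way to strip that text before indexing; note it is removed from _raw entirely, so it cannot be searched afterwards.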
Hi, I want to fetch HTTP error 500 from the logs using the search bar. I have set index, sourcetype, and source in the query. What should I add to retrieve only the logs with a 500 HTTP response? I have tried "status=500", which is not working.
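If status=500 returns nothing, the status field is probably not being extracted from your events. A sketch of two alternatives: search the raw text, then extract the code yourself (the rex pattern is an assumption based on a typical access-log layout where the status code is a standalone 3-digit number):

```
index=... sourcetype=... source=... "500"
| rex "\s(?<status>\d{3})\s"
| search status=500
```

The bare "500" term narrows the events cheaply first, and the rex/search pair then confirms that 500 is actually the status code rather than, say, part of a byte count.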
Hello Splunkers, I have used a query for MITRE field extraction, and after the extraction I got results with the rule name and the technique_id. But here the problem comes: each rule has a technique_id and a sub-technique_id, so I matched the sub-technique_id with the technique_id, and the result shows the same rule name twice, once per technique_id. I want to remove the duplicate rule names, but if I use dedup, rows carrying the other technique_ids are also removed. I have attached a screenshot for reference. The query I used to get the results is:

| rest /services/configs/conf-analyticstories
| where annotations!=""
| spath input=annotations path=mitre_attack{} output=mitre_attack
| eval rule_name=ltrim(title,"savedsearch://")
| fields rule_name,mitre_attack
| join rule_name
    [| rest /services/configs/conf-analyticstories
     | where searches!=""
     | eval rule_name=searches
     | table title,rule_name
     | eval rule_name=trim(rule_name,"[")
     | eval rule_name=trim(rule_name,"]")
     | eval rule_name=split(rule_name,",")
     | mvexpand rule_name
     | eval rule_name=trim(rule_name," ")
     | eval rule_name=trim(rule_name,"\"")]
| append
    [| rest /services/configs/conf-savedsearches
     | eval rule_name=title
     | search action.correlationsearch.annotations="*"
     | spath input=action.correlationsearch.annotations path=mitre_attack{} output=mitre_attack
     | fields rule_name, mitre_attack]
| eval technique_name = if(match(mitre_attack,"^T\d\d\d"),null(), mitre_attack)
| lookup mitre_tt_lookup technique_name OUTPUT technique_id as tmp_id0
| eval tmp_id1 = if(match(mitre_attack,"^T\d\d\d"), mitre_attack, null())
| eval technique_id=coalesce(tmp_id0, tmp_id1)
| where NOT isnull(technique_id)
| table rule_name, technique_id
| inputlookup mitre_user_rule_technique_lookup append=true
| inputlookup mitre_app_rule_technique_lookup append=true
| makemv tokenizer="([^\n\s]+)" technique_id
| mvexpand technique_id
| dedup rule_name,technique_id
| join rule_name
    [| rest /services/configs/conf-savedsearches
     | eval rule_name=title
     | eval stage= if(disabled == 1, "Disabled", "Enabled")
     | table rule_name, stage]
| eval subtechnique_id=if(match(technique_id,"\."),technique_id,null())
| eval technique_id=if(match(technique_id,"\."),replace(technique_id,"\.\d+",""),technique_id)
| search stage=Enabled
| table rule_name,technique_id

Thanks in advance....
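Rather than dedup, one sketch is to collapse the technique ids into a multivalue field per rule at the very end of the search, which shows each rule name once while keeping every distinct id:

```
| stats values(technique_id) as technique_id by rule_name
```

stats values() deduplicates within the multivalue list, so a rule with two technique_ids appears as a single row carrying both ids instead of being either duplicated or truncated.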
Hello, we are seeing O365 Splunk add-on data arrive after a delay of 1 day.
Which version of Splunk are you currently using? Answer: 8.2
Which app/add-on are you using to send the O365 data? Answer: Microsoft Graph Security Add-on for Splunk (TA-microsoft-graph-security-add-on-for-splunk) 1.2.1
Thanks, Lalit
Is there a reason the minimum number of nodes for indexer clustering needs to be 3? If three units are needed because of the role of parity, as in RAID theory, I don't think that role is necessary, because the cluster master is already handling it. Therefore I think 2 units should also be able to form a cluster, but I wonder why 3 units always appear as the default in most examples. Is there another reason?
index=my_index [search is here] | outputcsv mycsv.csv

After saving the search results into the mycsv.csv file, can I access the file via the search head?

| inputlookup mycsv.csv -- is not working
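For what it's worth, outputcsv writes to $SPLUNK_HOME/var/run/splunk/csv/ on the search head, and the matching reader is inputcsv, not inputlookup (which only reads lookup table files):

```
| inputcsv mycsv.csv
```

If the file needs to behave as a lookup instead, outputlookup/inputlookup is the pair to use, since those write to the lookups directory.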
I want any log file (local, or remote via a Universal Forwarder) with the filename "xyz.log" to have a sourcetype of XYZ and get indexed in my xyz index (not the main index). What do I need to put in props.conf? Do I also need to configure transforms.conf? I'm using Splunk Enterprise v8 on Windows. Current props.conf:

[source::...\\xyz.log]
sourcetype = XYZ
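Index routing isn't done in props.conf alone; a sketch of the usual props.conf + transforms.conf pair on the indexing tier, building on the stanza above (stanza and index names as in the question):

```
# props.conf
[source::...\\xyz.log]
sourcetype = XYZ
TRANSFORMS-route_xyz = route_to_xyz

# transforms.conf
[route_to_xyz]
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = xyz
```

The REGEX = . matches every event from that source, and DEST_KEY/_MetaData:Index rewrites the destination index at parse time. A simpler alternative, when you control the forwarder, is to set index = xyz and sourcetype = XYZ directly on the monitor stanza in inputs.conf.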
I'm creating a query where I get an id from a log on one side (first search), and in the second search I want to bring back only the results that have the ids from the first search. Then I want to calculate the difference between their counts. Something like:

index=anything source=anything route1 Payload OK
| rex field=_raw "\:[0-9]{4} \- (?<IDROUTE1>[0-9a-f]{8}) \-"
| stats count(_raw) as CROUTE1
| table IDROUTE1
| appendcols
    [search index=anything source=anything route2 Payload OK
     | rex field=_raw "\:[0-9]{4} \- (?<IDROUTE2>[0-9a-f]{8}) \-"
     | stats count(_raw) as CROUTE2
     | table IDROUTE2]
| where IDROUTE1=IDROUTE2
| eval TOTAL=CROUTE1-CROUTE2
| table TOTAL

What is not working is counting the events with where, I guess. The searches, when run separately, bring me the correct results. Events shows the correct number of events, but Statistics shows 0.
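One likely issue is that appendcols pairs rows positionally rather than by id (and stats count without a by clause discards the id fields entirely). A sketch of the stats-by-id pattern instead, keeping the index/source and rex from the question, assuming "route1"/"route2" appear literally in the raw events:

```
index=anything source=anything ("route1" OR "route2") "Payload OK"
| rex field=_raw "\:[0-9]{4} \- (?<ID>[0-9a-f]{8}) \-"
| eval route = if(searchmatch("route1"), "route1", "route2")
| stats count(eval(route="route1")) as CROUTE1
        count(eval(route="route2")) as CROUTE2
        dc(route) as routes by ID
| where routes=2
| eval TOTAL = CROUTE1 - CROUTE2
| table ID, CROUTE1, CROUTE2, TOTAL
```

Pulling both routes in one search and grouping by ID means each id is compared against itself, and the routes=2 filter keeps only ids seen on both routes, which is what the appendcols/where combination was trying to express.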