All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Tried the below regexes to blacklist/ignore 4688 EventCodes for *.exe processes coming from the Splunk forwarder path/directory, but none of them work: they either blacklist 4688 events from both the Splunk and non-Splunk paths, or stop sending events from both. Looking for a regex to add as a blacklist that ignores 4688 events only for *.exe files under the Splunk Universal Forwarder path/directory. Attempts so far:

blacklist = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\Program Files\Splunk(?:UniversalForwarder)?\bin\(?:btool|splunkd|splunk|splunk-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi)).exe)"
blacklist = EventCode="4688" Message="New Process Name: (?:[a-zA-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.+\.exe)"
blacklist = EventCode="4688" Message="New Process Name: (?:[a-zA-Z]:\\\\Program Files\\\\Splunk(?:\\\\UniversalForwarder)?\\\\bin\\\\.+\\.exe)"
blacklist = EventCode="4688" Message="New Process Name: C:\\\\Program Files\\\\SplunkUniversalForwarder\\\\bin\\\\"
blacklist = EventCode="4688" Message="New Process Name: C:\\Program Files\\SplunkUniversalForwarder\\bin\\"
blacklist = EventCode="4688" Message="New Process Name: (?i)[A-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.*\\.exe)"
blacklist = EventCode="4688" Message="New Process Name:\s*[A-Z]:\\Program Files\\Splunk(?:\\UniversalForwarder)?\\bin\\.+\\.exe)"
blacklist = EventCode="4688" Message="New Process Name:\s*[A-Z]:\\\\Program Files\\\\SplunkUniversalForwarder\\\\bin\\\\.*\\.exe"
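One pattern that sidesteps the backslash-escaping ambiguity entirely is to match the path separator with `.` instead of an escaped backslash. This is a hedged sketch, not a verified config — the stanza name and path are assumptions based on a default install, and you should test it against a few sample 4688 events before deploying:

```
[WinEventLog://Security]
# Sketch: "." matches any character, including "\", so no escaping is needed.
# Trade-off: the match is slightly looser than an exact path match.
blacklist1 = EventCode="4688" Message="(?i)New Process Name:.*Program Files.SplunkUniversalForwarder.bin.[\w-]+\.exe"
```

Because the conf value is handed to the regex engine, escaping rules for `\` are a common source of the "blocks everything or blocks nothing" behavior described above; avoiding the backslash in the pattern removes that variable.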
Hi @awilly162, as @PickleRick and @isoutamo also said, forwarding already-indexed data isn't a good idea because you pay your license twice. If you need to recover your old data, you have two options. The first is to extract all the data and re-ingest it into the new indexers, but it's a very long job because you have to extract data separately per index, sourcetype, and host, and you still pay your license twice. The second is an offline option: if you have different index names, you can manually copy the indexes (with Splunk stopped) from the old indexers to the new ones, remembering to also copy the indexes.conf file. This way you migrate all the data without paying the license twice, but you'll have two sets of indexes until the old ones reach the end of their retention period. There's a third option: engage Splunk Professional Services, but it is a bit expensive. Ciao. Giuseppe
Hi @Priya70, are you sure about the regex? It probably isn't correct, so the appname field isn't extracted and the stats doesn't give any result. I can help if you can share some samples of your logs. One general hint: you are working on Windows logs, which are multiline, so try adding the multiline option to your regex:

| rex field=Message "(?ms)Product: (?<appname>[^\-]+)"

Ciao. Giuseppe
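To see the multiline flags at work, here is a small emulation in the same style as other answers in this thread; the sample Message text is made up, and `urldecode("%0A")` is just a trick to embed newlines:

```
| makeresults
| eval Message = "Installation completed." . urldecode("%0A") . "Product: Example App" . urldecode("%0A") . "Version: 1.0"
| rex field=Message "(?ms)Product: (?<appname>[^\r\n]+)"
| table appname
```

The `(?ms)` flags let the pattern scan past line breaks in real multiline Windows events, while `[^\r\n]+` stops the capture at the end of the Product line.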
The key to doing this is to organize fields from events into an easy-to-access format after the lookup.  Traditionally, Splunkers use an mvjoin-split action, but for a highly variable use case like this that is almost impossible.  You want a structured data representation.  Something like, oh, I know, JSON. If you use Splunk 8.1 or later, I recommend this:

| tojson output_field=hash
| lookup cases.csv id
| foreach Field1 Field2
    [eval output = mvappend(output, '<<FIELD>>' . "=" . json_extract(hash, '<<FIELD>>'))]
| eval output = "id" . id . " Summary " . mvjoin(output, " ")
| table output hash Field1 Field2

This should work with any number of cases.  To illustrate the point, this comes from your sample data and sample lookup:

output | hash | Field1 | Field2
id1 Summary src_ip=2.2.2.2 dest_ip=1.1.1.1 | {"dest_ip":"1.1.1.1","id":1,"src_ip":"2.2.2.2"} | src_ip | dest_ip
id2 Summary user=bob domain=microsoft | {"domain":"microsoft","id":2,"user":"bob"} | user | domain
id3 Summary country=usa | {"city":"seattle","country":"usa","id":3} | country |
id4 Summary company=cisco product=splunk | {"company":"cisco","id":4,"product":"splunk"} | company | product

(Interestingly, if you are pre-8.1, you can replace json_extract with spath - the function, not the command - and the search still works in this case.) Here is an emulation for you to play with and compare with real data.

| makeresults
| eval data = split("E1: id=1 , dest_ip=1.1.1.1, src_ip=2.2.2.2,.....
E2: id=2, user=bob, domain=microsoft
E3: id=3 country=usa, city=seattle
E4: id=4 company=cisco, product=splunk", "
")
| mvexpand data
| rename data as _raw
| extract
| fields - _*
``` data emulation above ```
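Spelling out the pre-8.1 substitution mentioned above: only the foreach line changes, with spath() (the eval function) in place of json_extract(). A sketch, with the same field and lookup names assumed:

```
| foreach Field1 Field2
    [eval output = mvappend(output, '<<FIELD>>' . "=" . spath(hash, '<<FIELD>>'))]
```

spath() as an eval function takes the input field and a path, so the call shape is identical to json_extract() here.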
Not sure what you see, but this works in my instance. Data emulation is

| makeresults
| eval _raw = "PARAMETER VALUE
ASDF 6
XCV 34
ERT 1
TDED 14
GHT 3
GHB 2
BNHJ 17
QWE 17
DDD 9
YYY 8
KLO 7
POL 2
YUO 82
TRYU 2"
| multikv
| fields - _* linecount
I'm looking to export notable events from the Incident Review dashboard in Splunk Enterprise Security to a CSV/Excel file. I need to include the following details:

Notable Name (Rule Name)
Notable Triggered Time
Time Assigned for Investigation
Conclusion (e.g., True Positive (TP), False Positive (FP), Benign True Positive (BTP))
Open/Closed Status

What would be the best SPL query or method to extract this information? Also, is there a way to automate this export on a scheduled basis?

Currently I'm using this SPL query:

`notable`
| eval original_time=strftime(orig_time,"%c")
| eval reviewing_time=strftime(review_time,"%c")
| table search_name, comment, disposition_label, original_time, reviewing_time, owner, reviewer, status, status_description, status_label, urgency, username

and I'm getting results. However, I'm not getting an ID to locate and drill into an individual notable if I wanted to. How can I search for a specific notable? Is there a tracking number for a notable? I'd like to include it in my table as well.
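A sketch of one way to add an identifier, assuming your ES version populates the `event_id` field on notables (worth verifying in your environment before relying on it):

```
`notable`
| eval original_time=strftime(orig_time,"%c"), reviewing_time=strftime(review_time,"%c")
| table event_id, search_name, disposition_label, original_time, reviewing_time, owner, reviewer, status_label, urgency, comment
```

For the scheduled export, this search can be saved as a report with a schedule attached, and the results delivered via the built-in "send email" action (attaching CSV) or written out with `| outputlookup`.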
This is for an Ingest Processor pipeline.  I have hundreds of fields I want to redact as events pass through the pipeline.  The regex for each includes the field name to look for and lots of complex regex for the various formatting options. So rather than repeat that complex formatting many times, I was thinking of a loop through a list of the field names that assembles the regex and then processes it. Or maybe a command for each field name that calls a function to assemble the regex and execute the rex command. But I am starting to think that SPL2 can't do this.  I'd better go do some more reading. Thanks
Thanks _ I will have a look into this.
Hi Giuseppe, Thanks for showing an interest. I will try to include everything with this example. The code is:

$pipeline = | from $source
| rex "postCode:(?P<postCode1>\\d+)"
| eval regexstrA= "postCode:(?P<postCodeA>\\d+)"
| eval regexstrB= "postCode:(?P<postCodeB>\\\\d+)"
//| rex regexstrA
//| rex regexstrB
| into $destination;

The sample data is:

blah blah postCode:4548 blah blah

When I run it, the field value extracts properly, and the fields in lines 3 and 4 also get created and you can see their contents. But if I run with line 5 uncommented I get this error:

Error in 'rex' command: The regex 'regexstrA' does not extract anything. It should specify at least one named group. Format: (?<name>...).

and a similar error if I uncomment line 6. Any ideas why? Thanks
Thanks Sainag. Just confusing doco then, as this page: https://docs.splunk.com/Documentation/SCS/current/SearchReference/rexcommandexample which is all about SPL2, has examples in section 2 without the P after the ?, but section 3 has the P. All good, thanks
PowerConnect has native support for SAP CPQ https://help.powerconnect.io/powerconnectdocumentation/sap-cpq
Thanks, it’s exactly what I needed
@gcusello @yeahnah  I want to display the data in a tabular way similar to what you showed, but it isn't working on my specific JSON, only when I take it as makeresults... I have events flowing into Splunk in the two formats which I shared earlier. Can you help with this?
  Traditional SPL: (?<name>pattern) SPL2 (Edge/Ingest Processor): (?P<name>pattern)  
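Side by side, using the postCode example from this thread (the pipeline shape mirrors the poster's own example and is a sketch, not a verified Ingest Processor config):

```
``` Traditional SPL: named group without P ```
| rex field=_raw "postCode:(?<postCode>\d+)"
```

```
// SPL2 (Edge/Ingest Processor): named group with P
$pipeline = | from $source
| rex "postCode:(?P<postCode>\\d+)"
| into $destination;
```

Note the doubled backslash in the SPL2 string literal, also visible in the pipeline posted earlier in this thread.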
An image is worth a thousand words https://docs.bmc.com/xwiki/bin/view/Standalone/Product-Support/productinfo/BMC-AMI-Defender-App-for-Splunk/
Finally, the app was a little deeper on the BMC website. The link on Splunkbase doesn't work. BMC AMI Defender App for Splunk - BMC Documentation. You need to sign in on the BMC website, and you need a valid subscription to access the download. Usually, your organization's BMC admin should have that access.
Hi, does someone know where we can download the app for the BMC AMI Defender logs? Splunkbase provides a link to a BMC "404 page not found" error. I looked around on the BMC site when logged in, but couldn't find the app. BMC AMI Defender | Splunkbase. Is there another place we can grab this app? The BMC documentation of the app is here: BMC AMI Defender App for Splunk 2.9 - BMC Documentation. Thanks!
Thanks for linking that article, I hadn't seen it and it's got some handy tips.

1) Yes, this works.

3) Unfortunately, that seems to be the tradeoff based on what you're trying to do. When you filter (rest/artifact), you're looking for any artifacts which match your search criteria. When you request object detail (rest/artifact/5/name), you're restricting your results to artifact 5 specifically.  Based on your question, I'm guessing you're going to want something along these lines:

/rest/artifact?_filter_cef__destinationAddress={SNow CI}&page_size=0

I don't think you'll be able to get out of looping through your results one way or another.
Either I'm missing something here or you're overcomplicating it. Why is there a separate json1 and json2? Sticking with your mockup data:

| makeresults count=5
| streamstats count as a
| eval _time = _time + (60*a)
| eval json1="{\"id\":1,\"attrib_A\":\"A1\"}#{\"id\":2,\"attrib_A\":\"A2\"}#{\"id\":3,\"attrib_A\":\"A3\"}#{\"id\":4,\"attrib_A\":\"A4\"}#{\"id\":5,\"attrib_A\":\"A5\"}", json2="{\"id\":2,\"attrib_B\":\"B2\"}#{\"id\":3,\"attrib_B\":\"B3\"}#{\"id\":4,\"attrib_B\":\"B4\"}#{\"id\":6,\"attrib_B\":\"B6\"}"
| makemv delim="#" json1
| makemv delim="#" json2
| eval json=mvappend(json1,json2)
| fields _time json
| mvexpand json
| spath input=json
| fields - json
| stats values(attrib_*) as * by _time id

You can of course add fillnull at the end.
This worked for me.  My pie chart was just one of a few panels in a dashboard row.  When I would open that panel's search on its own, even fields with small values would be shown, because the pie chart then had its own "row".  Moving my pie chart into its own dashboard row and making it larger then showed the small values in the visualization.  Sadly, I have not found any other visualization settings or search parameters that would force a small pie chart to show all fields that have small values.  Quite annoying.