All Posts


@richgalloway, we can use the Event Action "Extract Fields" instead of creating the props/transforms, right? Thanks
Hi, I'll explain myself better with an example. I have the following values in a radio input:

Name -> Value
MB -> 1024/1024
GB -> 1024/1024/1024
(...)

With $token_name$ I can use the selected value in my search, but I would also like to use the name/label of the selected size in the chart legend. Is it possible to do this in Dashboard Studio? Thanks in advance for your help.
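One workaround sketch that avoids relying on any label token in Dashboard Studio is to mirror the radio's name/value mapping in the search itself (the bytes field here is a hypothetical stand-in for whatever is being scaled):

| eval size = bytes / $token_name$
| eval unit = case("$token_name$" == "1024/1024", "MB",
                   "$token_name$" == "1024/1024/1024", "GB",
                   true(), "bytes")
| timechart avg(size) by unit

Because tokens are substituted textually before the search runs, the case() comparison sees the selected value as a plain string, and the "by unit" clause puts the matching label in the legend.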
Hello, I want to get Rspamd logs into Splunk with every info available. The best I could do with Rspamd config yields to this: 2023-11-03 13:02:24 #56502(rspamd_proxy) <7fcfc8>; lua; [string "return function (t...:4: METATEST {"qid":"8BC8C2F741","user":"unknown","ip":"188.68.A.B","header_from":["foo bar via somelist <somelist@baz.org>"],"header_to":["list <somelist@baz.org>"],"header_subject":["proper subject"],"header_date":["Fri, 3 Nov 2023 08:00:43 -0400 (EDT)"],"scan_time":2457,"rcpt":["me@myself.net"],"size":6412,"score":-5.217652,"subject":"proper subject","action":"no action","message_id":"4SMK7v2HQTzJrP1@spike.bar.org","fuzzy":[],"rspamd_server":"rack.myself.net","from":"somelist-bounces@baz.org","symbols":[{"score":-0.500000,"group":"composite","groups":["composite"],"name":"RCVD_DKIM_ARC_DNSWL_MED"},{"score":0,"group":"headers","groups":["headers"],"name":"FROM_HAS_DN"},{"score":0,"group":"headers","options":["somelist@baz.org","somelist-bounces@baz.org"],"groups":["headers"],"name":"FROM_NEQ_ENVFROM"},{"score":-0.010000,"group":"headers","groups":["headers"],"name":"HAS_LIST_UNSUB"},{"score":0,"group":"headers","options":["somelist@baz.org"],"groups":["headers"],"name":"PREVIOUSLY_DELIVERED"},{"score":-1,"group":"abusix","options":["188.68.A.B:from"],"groups":["abusix","rbl"],"name":"RWL_AMI_LASTHOP"},{"score":-0.100000,"group":"mime_types","options":["text/plain"],"groups":["mime_types"],"name":"MIME_GOOD"},{"score":-0.200000,"group":"headers","options":["mailman"],"groups":["headers"],"name":"MAILLIST"},{"score":1,"group":"headers","groups":["headers"],"name":"SUBJECT_ENDS_QUESTION"},{"score":-0.200000,"group":"policies","options":["+ip4:188.68.A.B"],"groups":["policies","spf"],"name":"R_SPF_ALLOW"},{"score":-1,"group":"policies","options":["list.sys4.de:s=2023032101:i=1"],"groups":["policies","arc"],"name":"ARC_ALLOW"},{"score":0,"group":"ungrouped","options":["asn:19xxxx, ipnet:188.68.A.B/20, 
country:XY"],"groups":[],"name":"ASN"},{"score":0.100000,"group":"headers","groups":["headers"],"name":"RCVD_NO_TLS_LAST"},{"score":0,"group":"headers","groups":["headers","composite"],"name":"FORGED_RECIPIENTS_MAILLIST"},{"score":0,"group":"policies","options":["baz.org:+","bar.org:-"],"groups":["policies","dkim"],"name":"DKIM_TRACE"},{"score":0,"group":"headers","groups":["headers"],"name":"REPLYTO_DOM_NEQ_FROM_DOM"},{"score":0,"group":"policies","options":["bar.org:s=dktest"],"groups":["policies","dkim"],"name":"R_DKIM_REJECT"},{"score":-2.407652,"group":"statistics","options":["97.28%"],"groups":["statistics"],"name":"BAYES_HAM"},{"score":0,"group":"headers","groups":["headers"],"name":"TO_DN_ALL"},{"score":0,"group":"composite","groups":["composite"],"name":"DKIM_MIXED"},{"score":-0.200000,"group":"policies","options":["baz.org:s=20230217-rsa"],"groups":["policies","dkim"],"name":"R_DKIM_ALLOW"},{"score":0,"group":"headers","options":["3"],"groups":["headers"],"name":"RCVD_COUNT_THREE"},{"score":-0.600000,"group":"rbl","options":["188.68.A.B:from","188.68.A.B:received","168.100.A.B:received"],"groups":["rbl","dnswl"],"name":"RCVD_IN_DNSWL_MED"},{"score":-0.100000,"group":"rbl","options":["188.68.A.B:from"],"groups":["rbl","mailspike"],"name":"RWL_MAILSPIKE_GOOD"},{"score":0,"group":"policies","options":["baz.org"],"groups":["policies","dmarc"],"name":"DMARC_NA"},{"score":0,"group":"headers","options":["1"],"groups":["headers"],"name":"RCPT_COUNT_ONE"},{"score":0,"group":"mime_types","options":["0:+"],"groups":["mime_types"],"name":"MIME_TRACE"},{"score":0,"group":"headers","groups":["headers","composite"],"name":"FORGED_SENDER_MAILLIST"},{"score":0,"group":"headers","groups":["headers"],"name":"TO_EQ_FROM"},{"score":0,"group":"headers","options":["foo@bar.org"],"groups":["headers"],"name":"HAS_REPLYTO"}]}

Currently I'm extracting the JSON with a props.conf and a transforms.conf:

props.conf
[rspamd]
KV_MODE = json
TRANSFORMS-json_extract_rspamd = json_extract_rspamd

transforms.conf
[json_extract_rspamd]
SOURCE_KEY = _raw
DEST_KEY = _raw
LOOKAHEAD = 10000
#REGEX = ^([^{]+)({.+})$
REGEX = ^(\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d) (#\d+)\(([^)]+)\) ([^;]+); lua[^{]+{(.+})$
FORMAT = {"date":"$1","ida":"$2","process":"$3","idb":"$4",$5
CLONE_SOURCETYPE = _json

I end up with this in Splunk: [screenshot]

From here, I have two problems.

1st problem: contrary to native JSON (from my Amavis logs, for example), Splunk does not extract the fields nor compute basic stats on them unless I explicitly extract them… That's quite a pain. Is there a way / config setting to instruct Splunk to automagically extract every field?

2nd problem: this JSON is crap. Every object in "symbols[]" looks like this: [screenshot] It's almost unusable, as it prevents me from linking the name of a symbol to its score and to its options. Is there a parsing option / function I could use to reliably transform this into something I can work with? A good result could be turning

{
  group: abusix
  groups: [ abusix rbl ]
  name: RWL_AMI_LASTHOP
  options: [ A.B.C.D:from ]
  score: -1
}

into

RWL_AMI_LASTHOP: [
  group: abusix
  groups: [ abusix rbl ]
  name: RWL_AMI_LASTHOP
  options: [ A.B.C.D:from ]
  score: -1
]

I'm open to suggestions; I've been working for years with the great JSON logs of Amavis (perfect parsing and usability). This problem is new to me…
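A search-time sketch of that restructuring (assuming the cloned events are searchable under sourcetype=_json and keep the top-level qid field from the sample) is to expand symbols{} into one row per symbol object and then regroup by name:

sourcetype=_json
| spath path=symbols{} output=symbol
| mvexpand symbol
| spath input=symbol
| rename options{} as options
| stats list(score) as score list(options) as options by qid name

Each expanded symbol is re-parsed with spath input=symbol, so name, score and options stay linked to each other per symbol instead of being flattened into parallel multivalue fields.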
You can do it also based on source, but you must remember precedence!

[<spec>]
* This stanza enables properties for a given <spec>.
* A props.conf file can contain multiple stanzas for any number of different <spec>.
* Follow this stanza name with any number of the following setting/value pairs, as appropriate for what you want to do.
* If you do not set a setting for a given <spec>, the default is used.

<spec> can be:
1. <sourcetype>, the source type of an event.
2. host::<host>, where <host> is the host, or host-matching pattern, for an event.
3. source::<source>, where <source> is the source, or source-matching pattern, for an event.
4. rule::<rulename>, where <rulename> is a unique name of a source type classification rule.
5. delayedrule::<rulename>, where <rulename> is a unique name of a delayed source type classification rule. These are only considered as a last resort before generating a new source type based on the source seen.

[<spec>] stanza precedence:
For settings that are specified in multiple categories of matching [<spec>] stanzas, [host::<host>] settings override [<sourcetype>] settings. Additionally, [source::<source>] settings override both [host::<host>] and [<sourcetype>] settings.

And of course a restart is needed after changing those. Also "splunk btool props list --debug" is an excellent tool to check that you have the correct configuration in use.
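As a hypothetical illustration of that precedence (the stanza names and masking regex below are invented for the example, not taken from any real config):

# props.conf
[abc]
# sourcetype stanza: lowest precedence of the three
SEDCMD-mask = s/securityToken=\S+/securityToken=****/g

[host::winhost*]
# host stanza: overrides the [abc] sourcetype stanza
SEDCMD-mask = s/securityToken=\S+/securityToken=####/g

[source::C:\\abc\\def\\xyz\\*\\*.log]
# source stanza: overrides both stanzas above
SEDCMD-mask = s/securityToken=\S+/securityToken=@@@@/g

An event matching all three stanzas would get the source-based mask.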
Hi All, I want to create an SPL query that first returns data by matching the destination IP address from Palo Alto logs. Then, according to the destination IP, it will resolve the actual destination hostname from Symantec logs and Windows event logs in separate fields. I was able to match the destination IP (dest_ip) from Palo Alto logs with Symantec logs and return the hostname (if available) from it. However, I am struggling to do the same by joining Windows logs to return the values, which should be equal to the hostname in the Symantec logs. Can someone kindly assist me in fixing this code to retrieve the expected results?

index=*-palo threat="SMB: User Password Brute Force Attempt(40004)" src=* dest_port=445
| eval dest_ip=tostring(dest)
| join type=left dest_ip
    [ search index=*-sep device_ip=*
      | eval dest_ip=tostring(device_ip)
      | stats count by dest_ip user_name device_name ]
| eval dest_ip=tostring(dest)
| join type=left dest_ip
    [ search index="*wineventlog" src_ip=*
      | eval dest_ip=tostring(src_ip)
      | eval username=tostring(user)
      | stats count by dest_ip username ComputerName ]
| table future_use3 src_ip dest_ip dest_port user device_name user_name rule threat repeat_count action ComputerName username
| sort src_ip
| rename future_use3 AS "Date/Time" src_ip AS "Source IP" dest_ip AS "Destination IP" user AS "Palo Detected User" user_name AS "Symantec Detected User @ Destination" device_name AS "Symantec Destination Node" rule AS "Firewall Rule" threat as "Threat Detected" action as "Action" repeat_count AS "Repeated Times"

@eve
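As a side note, join subsearches are subject to row limits and can silently drop results; one common alternative is a single stats pass over all three indexes. A sketch, using only the field names from the search above and certainly needing tuning:

(index=*-palo threat="SMB: User Password Brute Force Attempt(40004)" src=* dest_port=445)
OR (index=*-sep device_ip=*)
OR (index="*wineventlog" src_ip=*)
| eval dest_ip=coalesce(dest, device_ip, src_ip)
| stats values(user_name) as user_name
        values(device_name) as device_name
        values(ComputerName) as ComputerName
        by dest_ip

The coalesce() maps each source's own IP field onto a shared dest_ip key, so stats can bring the Symantec and Windows hostnames together without any join.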
Hi @anandhalagaras1, as @isoutamo said, you have to put these conf files on the indexers or (if present) on the heavy forwarders. Ciao. Giuseppe
@isoutamo yes, I have placed the props on the HF. I tried with the source format as well, and that didn't work either. Is the source format correct? And can we do masking based on host in props? If yes, kindly let me know.
* If CLONE_SOURCETYPE is used as part of a transform, the transform creates a modified duplicate event for all events that the transform is applied to via normal props.conf rules. I'd point your attention to "for all events that the transform is applied to via normal props.conf rules". So either match your events by host/source if possible, or clone all events and then filter out those you don't need (not the prettiest idea, I know). And yes, @isoutamo's remark about _json is a good point.
Hi, when you are using CLONE_SOURCETYPE it always clones all events. If you don't need all events, then you must filter those away and format only the needed events as JSON. You should never use _json as your real sourcetype; it's there just for reference. You should always define your own sourcetype names, even when the events are e.g. in JSON format! r. Ismo
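A sketch of that filter-after-clone approach, assuming the clone goes to a custom sourcetype named rspamd_json rather than _json, and assuming index-time transforms defined for the cloned sourcetype apply to the duplicates (the "no JSON payload" regex is a guess to be tuned):

# props.conf
[rspamd_json]
TRANSFORMS-drop_nonjson = drop_nonjson

# transforms.conf
[drop_nonjson]
# drop cloned events that contain no JSON object at all
REGEX = ^[^{]*$
DEST_KEY = queue
FORMAT = nullQueue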
I found the solution and wanted to post it here. I added Device name, which then allowed me to use IONS (User ID) to get the total count. My new challenge is to get these stats on a per-day basis in a line chart. Perhaps someone can give me some ideas.

| stats count by Device IONS
| where count >= 10
| appendpipe [| stats count as IONS | eval Device="Total"]
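For the per-day line chart, one possible sketch (reusing the same base search, shown here only as a placeholder) is to bin by day before counting and then chart the daily results:

<your base search>
| bin _time span=1d
| stats count by _time Device IONS
| where count >= 10
| timechart span=1d dc(IONS) as distinct_IONS by Device

The where clause applies the >= 10 threshold per day and per Device/IONS pair, and timechart then plots one line per Device.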
Have you put those configurations on the HF? As it's the first full Splunk instance in the chain, it will perform those actions, not the indexer.
@gcusello I have tried the first solution but it didn't mask the value. I forward the UF logs to the HF server and then to the indexers. And I have tried with the sourcetype as well as with the source, but it didn't work.

Props.conf:

sourcetype-based:
[abc]
SEDCMD-mask = s/securityToken=[^ ]*/securityToken=********/g

source-based:
[source::C:\\abc\\def\\xyz\\*\\*.log]
SEDCMD-mask = s/securityToken=[^ ]*/securityToken=********/g
Hello, I'm currently trying to convert some mixed-text events into JSON. The log file is made of some pure-text log lines and some other lines that start with plain text and end with some JSON. I have created a transforms.conf rule to extract the JSON and to clone the event into the _json sourcetype:

[json_extract_rspamd]
SOURCE_KEY = _raw
DEST_KEY = _raw
LOOKAHEAD = 10000
#REGEX = ^([^{]+)({.+})$
REGEX = ^(\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d) (#\d+)\(([^)]+)\) ([^;]+); lua[^{]+{(.+})$
FORMAT = {"date":"$1","ida":"$2","process":"$3","idb":"$4",$5
CLONE_SOURCETYPE = _json

This is working, but unfortunately it will also clone every event from that log file. Is there a way to trigger the CLONE_SOURCETYPE only when the REGEX is matched?
On which instance(s) did you install the TA? With what, specifically, do you need help?
Hi @leooooowang, you could use the searches from your present Advanced XML page to recreate that page in Simple XML. Start from the panels and then add the inputs one by one. It isn't complicated; it's just a long and tedious job. Ciao. Giuseppe
Was it possible?
Hi @Roy_9, as I said, if you remove this app you'll receive annoying messages. I suggest making it not visible instead. If the issue is the emails that this app sends, you can disable the checks. Ciao. Giuseppe
So, what events are returned from the first part of your search with http_method="POST"?
The dataset I am using is "Boss of the SOC" Version 1 ("botsv1"): Index="botsv1". My assessment question is as follows: Use a "wild card" search to find all passwords used against the destination IP. As a hint, you will need to use a specific http_method= as the attack was against a web server. You will also need to pipe the result into a search that uses the form_data field to search for user passwords within the form_data. What is the SPL statement used?

Warning! My SPL may be all over the place, but here it is:

Index="botsv1" dest_ip="192.168.250.70"  "http_method=POST form_data"*" source:"stream:http" | table form_data, src_ip, password

I am very new here! Your help is much appreciated.
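For comparison, a cleaned-up sketch of that search (assuming the usual botsv1 field names; the exact form_data wildcard is a guess): the wildcard belongs inside the field value, and stream:http is a sourcetype, not a source:

index="botsv1" sourcetype="stream:http" dest_ip="192.168.250.70" http_method="POST" form_data=*passwd*
| table _time src_ip form_data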
What fields do you already have extracted? By "filter", do you mean filter in or filter out, i.e. do you want to keep the events with T[A] along with everything else, keep only those events with T[A], or remove them altogether?