@isoutamo Yes, I have placed the props on the HF. I also tried with the source-based stanza and that didn't work either. Is the source format correct? Can we do masking based on host in props.conf? If yes, kindly let me know.
* If CLONE_SOURCETYPE is used as part of a transform, the transform creates a modified duplicate event for all events that the transform is applied to via normal props.conf rules. I point your attention to "for all events that the transform is applied to via normal props.conf rules". So either match your events by host/source matching if possible, or clone all events and then filter out those you don't need (not the prettiest idea, I know). And yes, @isoutamo's remark about _json is a good point.
Hi, when you are using CLONE_SOURCETYPE it always clones all events. If you don't need all events, then you must filter those away and format only the needed events as JSON. You should never use _json as your real sourcetype; it's there just for reference. You should always define your own sourcetype names, even if those are e.g. in JSON format! r. Ismo
I found the solution and wanted to post it here. I added Device name, which then allowed me to use IONS (User ID) to get the total count. My new challenge is to get these stats on a per-day basis in a line chart. Perhaps someone can give me some ideas.

| stats count by Device IONS | where count >= 10 | appendpipe [| stats count as IONS | eval Device="Total"]
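For the per-day line chart, one option is timechart. A minimal sketch, assuming the same Device and IONS fields, with index=your_index as a placeholder:

```spl
index=your_index
| timechart span=1d dc(IONS) as users by Device
```

This plots one series per Device with the daily distinct count of IONS values; swap dc(IONS) for count if raw event counts are wanted instead.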
Have you put those configurations on the HF? As it's the first full Splunk instance in the path, it will perform those actions, not the indexer.
@gcusello I have tried the first solution but it didn't mask the value. I forward the UF logs to the HF server and from there to the indexers. I have tried with the sourcetype as well as with the source, but neither worked.

props.conf, sourcetype-based:
[abc]
SEDCMD-mask = s/securityToken=[^ ]*/securityToken=********/g

source-based:
[source::C:\\abc\\def\\xyz\\*\\*.log]
SEDCMD-mask = s/securityToken=[^ ]*/securityToken=********/g
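On the host question raised earlier in the thread: props.conf stanzas can also match by host. A hedged sketch, with winserver01 as a placeholder hostname:

```conf
[host::winserver01]
SEDCMD-mask = s/securityToken=[^ ]*/securityToken=********/g
```

As noted elsewhere in the thread, this must live on the first full Splunk instance that parses the data (the HF in this setup), and that instance needs a restart for props.conf changes to take effect.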
Hello, I'm currently trying to convert some mixed-text events into JSON. The log file is made of some pure-text log lines and some other lines that start with plain text and end with some JSON. I have created a transforms.conf rule to extract the JSON and to clone the event into the _json sourcetype:

[json_extract_rspamd]
SOURCE_KEY = _raw
DEST_KEY = _raw
LOOKAHEAD = 10000
#REGEX = ^([^{]+)({.+})$
REGEX = ^(\d\d\d\d-\d\d-\d\d \d\d:\d\d:\d\d) (#\d+)\(([^)]+)\) ([^;]+); lua[^{]+{(.+})$
FORMAT = {"date":"$1","ida":"$2","process":"$3","idb":"$4",$5
CLONE_SOURCETYPE = _json

This is working, but unfortunately it also clones every event from that log file. Is there a way to trigger the CLONE_SOURCETYPE only when the REGEX is matched?
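One workaround pattern, since CLONE_SOURCETYPE duplicates every event the transform is applied to: clone everything, then route the clones that were not reformatted (their _raw still does not start with `{`) to nullQueue. A sketch, with rspamd_json as a hypothetical sourcetype name used in place of _json:

```conf
# transforms.conf
[drop_non_json_clones]
REGEX = ^[^{]
DEST_KEY = queue
FORMAT = nullQueue

# props.conf
[rspamd_json]
TRANSFORMS-drop = drop_non_json_clones
```

This assumes the cloning transform sets CLONE_SOURCETYPE = rspamd_json; events the REGEX matched start with `{` after FORMAT is applied, so only the unmatched clones are dropped.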
On which instance(s) did you install the TA? With what, specifically, do you need help?
Hi @leooooowang, you could use the searches from your present Advanced XML page to recreate the page in Simple XML. Start from the panels and then add the inputs one by one. It isn't complicated; it's just a long and tedious job. Ciao. Giuseppe
Was it possible?
Hi @Roy_9, as I said, if you remove this app you'll receive annoying messages. I suggest making it not visible instead. If the issue is the emails this app sends, you can disable the checks. Ciao. Giuseppe
So, what events are returned from the first part of your search with http_method="POST"?
The dataset I am using is "Boss of the SOC Version 1" (botsv1): index="botsv1". My assessment question is as follows: Use a wildcard search to find all passwords used against the destination IP. As a hint, you will need to use a specific http_method= as the attack was against a web server. You will also need to pipe the result into a search that uses the form_data field to search for user passwords within form_data. What is the SPL statement used?

Warning! My SPL may be all over the place, but here it is:

Index="botsv1" dest_ip="192.168.250.70"  "http_method=POST form_data"*" source:"stream:http" | table form_data, src_ip, password

I am very new here! Your help is much appreciated.
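The quoting in that attempt mixes literal strings with field=value syntax. A hedged rewrite of the same idea, assuming the stream:http sourcetype and its form_data field (adjust the wildcard to whatever the actual form field name is):

```spl
index="botsv1" sourcetype="stream:http" dest_ip="192.168.250.70" http_method=POST form_data=*passwd*
| table src_ip, form_data
```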
What fields do you already have extracted? By "filter", do you mean filter in or filter out, i.e. do you want to keep only the events with T[A], or remove them altogether?
Thanks for responding so quickly! The SPL I have been trying is as follows:

index=indexname
| stats count by domain, src_ip
| sort -count
| stats list(domain) as Domain, list(count) as count, sum(count) as total by src_ip
| sort -total
| head 10
| fields - total

The task I have been given is: Use the stats, count, and sort search terms to display the top ten URIs in ascending order. This is from the botsv1 dataset.
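One reading of that task: count per URI, keep the ten highest, then re-sort ascending for display. A sketch, assuming a uri field is available in the stream:http data:

```spl
index=botsv1 sourcetype="stream:http"
| stats count by uri
| sort 10 -count
| sort count
```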
Hi, I have log lines like this. 1) I need to group them by ID; 2) filter those transactions that have T[A].

#txn1
16:30:53:002 moduleA ID[123]
16:30:54:002 moduleA ID[123]
16:30:55:002 moduleB ID[123]T[A]
16:30:56:002 moduleC ID[123]
#txn2
16:30:57:002 moduleD ID[987]
16:30:58:002 moduleE ID[987]T[B]
16:30:59:002 moduleF ID[987]
16:30:60:002 moduleZ ID[987]

Any idea? Thanks
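Assuming the ID and the optional T flag can be pulled out with rex, one approach is to tag every event of a transaction with all flags seen for that ID, then keep only the transactions containing A:

```spl
| rex field=_raw "ID\[(?<id>\d+)\](?:T\[(?<flag>\w+)\])?"
| eventstats values(flag) as flags by id
| search flags="A"
```

With the sample data above, this would keep all four ID[123] events and drop the ID[987] ones.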
So, search for "exception". This will return events that contain this word. However, this might give you some false positives, so you need to be more precise about defining exactly what you consider to be an exception event. Once you have these, you can look to extract the exception type for your statistics.
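A minimal sketch of that extraction, assuming Java-style exception class names and index=your_index as a placeholder:

```spl
index=your_index "exception"
| rex field=_raw "(?<exception_type>\w+(?:Exception|Error))\b"
| stats count by exception_type
| sort -count
```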
https://docs.splunk.com/Documentation/Forwarder/9.1.1/Forwarder/InstallaWindowsuniversalforwarderfromaninstaller The section "about the least-privileged user"
Yeah, you can probably fiddle with grouping by the T value, or with binning _time to some value. I suppose not all parts of a single transaction will have the same exact timestamp; they would probably differ by some fraction of a second or even whole seconds, so you'd have to bin _time and then use it for grouping.
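That binning idea could look roughly like this, assuming the module and ID fields extract cleanly with rex and that a 5-second span is wide enough to cover one transaction:

```spl
| rex field=_raw "(?<module>module\w+) ID\[(?<id>\d+)\]"
| bin _time span=5s
| stats values(module) as modules by _time, id
```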
No, I haven't configured it yet, but I could use some help. I only installed the add-on.