Activity Feed
- Posted Why is my LINE_BREAKER not working? on Getting Data In. 09-19-2023 02:15 PM
- Tagged Why is my LINE_BREAKER not working? on Getting Data In. 09-19-2023 02:15 PM
- Posted Having trouble with SSL Certificates while trying to integrate Trellix with Splunk on Installation. 07-13-2023 02:46 PM
- Tagged Having trouble with SSL Certificates while trying to integrate Trellix with Splunk on Installation. 07-13-2023 02:46 PM
- Posted Re: How can I compare two values obtained from a search and a lookup table? on Splunk Search. 11-14-2022 02:55 PM
- Posted How can I compare two values obtained from a search and a lookup table? on Splunk Search. 11-14-2022 02:44 PM
- Tagged How can I compare two values obtained from a search and a lookup table? on Splunk Search. 11-14-2022 02:44 PM
- Posted Help joining two different sourcetypes from the same index that both have a field with the same value but different name on Splunk Search. 11-10-2022 09:10 AM
- Tagged Help joining two different sourcetypes from the same index that both have a field with the same value but different name on Splunk Search. 11-10-2022 09:10 AM
- Karma Re: This delim in my query is not working, how could I possibly solve this problem? for venky1544. 07-21-2022 09:41 AM
- Posted This delim in my query is not working, how could I possibly solve this problem? on Splunk Search. 05-31-2022 01:42 PM
- Tagged This delim in my query is not working, how could I possibly solve this problem? on Splunk Search. 05-31-2022 01:42 PM
- Karma Re: Help on query to filter incoming traffic to a firewall for PickleRick. 05-27-2022 01:22 PM
Topics I've Started
09-19-2023
02:15 PM
Hello, recently I've added a new firewall as a source to the Splunk deployment at work, but I can't figure out why my LINE_BREAKER is not working. I've deployed the configuration both at the heavy forwarder and at the indexers, but still can't make it work. Logs are coming in like this:

Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:27 devname="fw_name_1" devid="fortigate_id_1" eventtime=1695157347491321753 tz="-0500" logid="0001000014" type="traffic" subtype="local" level="notice" vd="vdom1" srcip=xx.xx.xx.xx srcport=3465 srcintf="wan_1" srcintfrole="undefined" dstip=xx.xx.xx.xx dstport=443 dstintf="client" dstintfrole="undefined" srccountry="Netherlands" dstcountry="Peru" sessionid=1290227282 proto=6 action="close" policyid=0 policytype="local-in-policy" service="HTTPS" trandisp="noop" app="HTTPS" duration=9 sentbyte=1277 rcvdbyte=8294 sentpkt=11 rcvdpkt=12 appcat="unscanned"
Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:28 devname="fw_name_1" devid="fortigate_id_1" eventtime=1695157347381319603 tz="-0500" logid="0000000013" type="traffic" subtype="forward" level="notice" vd="vdom2" srcip=143.137.146.130 srcport=33550 srcintf="wan_2" srcintfrole="undefined" dstip=xx.xx.xx.xx dstport=443 dstintf="3050" dstintfrole="lan" srccountry="Peru" dstcountry="United States" sessionid=1290232934 proto=6 action="close" policyid=24 policytype="policy" poluuid="12c55036-3d5b-51ee-9360-c36a034ab600" policyname="INTERNET_VDOM" service="HTTPS" trandisp="noop" duration=2 sentbyte=2370 rcvdbyte=5826 sentpkt=12 rcvdpkt=11 appcat="unscanned"
Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:28 devname="fw_name_1" devid="fortigate_id_1" eventtime=1695157347443046437 tz="-0500" logid="0000000020" type="traffic" subtype="forward" level="notice" vd="vdom2" srcip=xx.xx.xx.xx srcport=52777 srcintf="wan_2" srcintfrole="undefined" dstip=xx.xx.xx.xx dstport=443 dstintf="3050" dstintfrole="lan" srccountry="Peru" dstcountry="Peru" sessionid=1289825875 proto=6 action="accept" policyid=24 policytype="policy" poluuid="12c55036-3d5b-51ee-9360-c36a034ab600" policyname="INTERNET_VDOM" service="HTTPS" trandisp="noop" duration=500 sentbyte=1517 rcvdbyte=1172 sentpkt=8 rcvdpkt=7 appcat="unscanned" sentdelta=1517 rcvddelta=1172
Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:28 devname="fw_name_1" devid="fortigate_id_1" eventtime=1695157347481317830 tz="-0500" logid="0000000013" type="traffic" subtype="forward" level="notice" vd="vdom2" srcip=xx.xx.xx.xx srcport=18191 srcintf="3050" srcintfrole="lan" dstip=xx.xx.xx.xx dstport=443 dstintf="wan_2" dstintfrole="undefined" srccountry="Peru" dstcountry="Peru" sessionid=1290224387 proto=6 action="timeout" policyid=21 policytype="policy" poluuid="ab285ae0-3d5a-51ee-dce1-3f4aec1e32dc" policyname="PUBLICACION_VDOM" service="HTTPS" trandisp="noop" duration=13 sentbyte=180 rcvdbyte=0 sentpkt=3 rcvdpkt=0 appcat="unscanned"
Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:27 devname="fw_name_2" devid="fortigate_id_2" eventtime=1695157346792901761 tz="-0500" logid="0000000013" type="traffic" subtype="forward" level="notice" vd="vdom3" srcip=xx.xx.xx.xx srcport=47767 srcintf="3006" srcintfrole="lan" dstip=xx.xx.xx.xx dstport=8580 dstintf="wan_2" dstintfrole="undefined" srccountry="United States" dstcountry="Peru" sessionid=3499129086 proto=6 action="timeout" policyid=18 policytype="policy" poluuid="9cba23b2-3dfa-51ee-847f-49862ff000c0" policyname="PUBLICACION_VDOM" service="tcp/8580" trandisp="noop" duration=10 sentbyte=40 rcvdbyte=0 sentpkt=1 rcvdpkt=0 appcat="unscanned" srchwvendor="Cisco" devtype="Router" mastersrcmac="xxxxxxxxxxxxxxx" srcmac="xxxxxxxxxxxxxxx" srcserver=0

And the configuration I added to props.conf is the following:

[host::host_ip]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\w{3}\s+\d{1,2}\s\d{2}\:\d{2}\:\d{2})
TIME_PREFIX = eventtime=
TIME_FORMAT = %b %d %H:%M:%S

The format is similar to the configuration applied to similar sources, so I can't figure out why it isn't working. I'd appreciate any kind of insight you guys could bring. Thanks in advance!
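As a quick sanity check outside Splunk, the LINE_BREAKER pattern itself can be exercised with Python's re module, whose syntax is close enough to Splunk's PCRE for this particular pattern. The shortened events below are stand-ins for the real feed, and this only tests the regex, not where the stanza is applied:

```python
import re

# The LINE_BREAKER pattern from props.conf: break on newlines that are
# immediately followed by a syslog-style "Mon DD HH:MM:SS" timestamp.
LINE_BREAKER = r"([\r\n]+)(?=\w{3}\s+\d{1,2}\s\d{2}\:\d{2}\:\d{2})"

# Shortened stand-in events (hypothetical sample, same leading format).
raw = (
    'Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:27 devname="fw_name_1"\n'
    'Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:28 devname="fw_name_1"\n'
    'Sep 19 16:02:28 host_ip date=2023-09-19 time=16:02:28 devname="fw_name_2"'
)

# re.split with a capturing group also returns the matched separators,
# so keep only the non-whitespace chunks (the events themselves).
parts = [p for p in re.split(LINE_BREAKER, raw) if p and not p.isspace()]
for event in parts:
    print(event)
```

If the pattern is sound, each syslog line comes back as its own event; if it silently matches nothing, the whole chunk stays as one string.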
Labels:
- props.conf
07-13-2023
02:46 PM
Hello everyone, I hope you guys can help me figure this out, since I've been thinking about it a lot since yesterday. I'm by no means an expert in Splunk; however, I've been tasked with integrating Trellix EDR log files into Splunk. I found an app on the Splunkbase site (https://splunkbase.splunk.com/app/6480) that could be the answer to my task. I installed the app on the heavy forwarder, as I had done before when integrating Rapid7 logs, and followed the brief guide provided by the author. However, this is where the problems start. After I configured the input settings, I didn't receive a single log. I checked the logs and found out that the problem had something to do with SSL certificates:

ERROR pid=7210 tid=MainThread file=base_modinput.py:log_error:309 | Error in input_module_trellix_edr_input.get_threats() - line 127 : HTTPSConnectionPool(host='api.soc.us-east-1.mcafee.com', port=443): Max retries exceeded with url: /ft/api/v2/ft/threats?sort=-lastDetected&filter=%7B%22severities%22:%20%5B%22s0%22,%20%22s1%22,%20%22s2%22,%20%22s3%22,%20%22s4%22,%20%22s5%22%5D,%20%22scoreRange%22:%20%5B30%5D%7D&from=1686690509938&limit=10000&skip=0 (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1106)')))

I immediately started googling that error and found out that it was probably an outdated SSL certificate. Thing is, when I connected to the heavy forwarder through SSH and tried to update the Python SSL certificates through pip, I found out that you can't do that on a Splunk server. I found a workaround that implied I could somehow disable the SSL check; I spent hours looking at the most suspicious .py files but couldn't find where that check was made (it also didn't help that I know next to nothing about Python). I also tried playing with the settings of the input, trying out different regions, etc., but it was all for naught.
Ultimately, I started thinking that it was a problem caused by outdated SSL certificates hardcoded in the app (I don't really know if that's possible), so I ended up deciding to contact support. It was at that point that I noticed this app isn't supported by Splunk and that I had to contact the developer of the app if I wanted any kind of support. I did some research on mister "Martin Ohl" and found out that he no longer works at Trellix (no wonder the app never had an update). I went to Trellix's support page and couldn't find a support email, so I started digging through their support and FAQ pages. I could not find a single post or hint about a possible integration with any SIEM, not just Splunk. So I thought posting my case in the Splunk Community forums was my best bet. I'd appreciate any hint, insight, or even an anecdote about a similar case. If anyone has managed to integrate Trellix into Splunk, it'd be a lot of help if you could share your experience. Or even if someone knows how to deal with the SSL certificates thing. I'll be uploading a PDF file with more detail about the error log I received. Thanks in advance.
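For reference, the two common shapes of this kind of workaround in Python's standard library look like the sketch below: point verification at a trusted CA bundle (e.g. one exported from a corporate TLS-inspecting proxy), or temporarily disable verification just to confirm the diagnosis. This is a generic standard-library illustration, not the add-on's actual code, and the bundle path is a hypothetical placeholder:

```python
import ssl

# Option 1: keep verification on, but trust a specific CA bundle.
# The path below is a hypothetical placeholder.
ctx_custom = ssl.create_default_context()
# ctx_custom.load_verify_locations("/path/to/corporate_ca_bundle.pem")

# Option 2 (debugging only): disable verification entirely, to confirm
# the failure really is certificate validation and not connectivity.
# check_hostname must be turned off before verify_mode is relaxed.
ctx_insecure = ssl.create_default_context()
ctx_insecure.check_hostname = False
ctx_insecure.verify_mode = ssl.CERT_NONE
```

For add-ons built on the requests library, the equivalent knobs are usually a verify= argument on the HTTP call or the REQUESTS_CA_BUNDLE environment variable, though where exactly this add-on makes its calls would still need to be located.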
Labels:
- add-on
- app
- heavy forwarder
11-14-2022
02:55 PM
Thanks a lot. Just as I posted this, I learned about appendcols via a post I found on Stack Overflow. Anyway, thanks a lot for the recommendations; I'll take them into consideration to make the query cleaner.
11-14-2022
02:44 PM
Hello, for the past week I've been working on a way to run some queries for a report about vulnerability findings. I have made a lookup table for the vulnerability details, and I call it from the main query to do the work. However, I'm currently having a bit of trouble trying to figure out the scheduled query to run in order to update the vulnerability details lookup table. Since Rapid7 sometimes doesn't import its vulnerability definitions into Splunk correctly (i.e., there are 270,000 lines, but for some reason, some days only 12,000 get imported into Splunk), I wanted to make some validations before deciding to run the outputlookup to update the table. To do this, I have devised the following so far:

index=rapid7 sourcetype="rapid7:insightvm:vulnerability_definition"
| dedup id
| lookup soc_vulnerabilities.csv vulnerability_id OUTPUT vulnerability_id title description
| stats count as today
| append
[| inputlookup soc_vulnerabilities.csv
| stats count as yesterday]
| eval prov=yesterday
| eval conditional=if(today>=yesterday,1,0)
| table conditional, today, yesterday, prov

As you can see, all I'm doing is validating whether the number of lines being imported into Splunk is the same as or greater than the current number of lines stored in the lookup table. Thing is, the eval with the conditional isn't working, because both total values are being shown as if they were unrelated, which they kind of are. The result table is as follows:

conditional  today   yesterday  prov
0            238732
0                    238732     238732

What I want is to compare the today and yesterday values in order to determine whether the lookup table should or should not be updated. I've been looking at the documentation for a way to make it work and have also checked some other posts here in the forums, but I haven't found a similar case. I hope it's not because it is impossible; nevertheless, I'd appreciate it if you guys could help me figure this out, or tell me whether I should approach this problem from another angle. Additional info: for those who have worked with these logs before, the vulnerability_id field doesn't exist in that sourcetype, so we created it ourselves in the normalization options. Thanks in advance.
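For intuition on why the two counts never meet: append stacks the subsearch result as an extra row, so today and yesterday live on different rows and the if() compares against a null. A rough, Splunk-agnostic Python sketch of the difference between row-stacking (append-like) and column-merging (appendcols-like), using the counts from above:

```python
# Result of the main search: one row with today's count.
main = [{"today": 238732}]
# Result of the subsearch: one row with yesterday's count.
sub = [{"yesterday": 238732}]

# append-like behaviour: rows are stacked, so the two fields
# never appear on the same row and cannot be compared directly.
appended = main + sub

# appendcols-like behaviour: columns are merged row by row,
# so both counts land on the same row and the comparison works.
combined = [{**m, **s} for m, s in zip(main, sub)]
conditional = 1 if combined[0]["today"] >= combined[0]["yesterday"] else 0
```

With the counts on one row, the update guard reduces to a single comparison, which is the behaviour the conditional was meant to have.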
11-10-2022
09:10 AM
Hello everybody,
I'm trying to join two different sourcetypes from the same index that both have a field with the same value but different name.
i.e:
sourcetype 1 - field name=x - value=z | sourcetype 2 - field name=y - value=z
I've tried these two queries but had no success joining the two:
index=rapid7 sourcetype="rapid7:insightvm:asset:vulnerability_finding" finding_status=new
| eval date=strftime(now(),"%m-%d")
| eval date_first=substr(first_found,6,5)
| where date=date_first
| join type=outer left=L right=R where L.vulnerability_id=R.id
[ search index=rapid7 sourcetype="rapid7:insightvm:vulnerability_definition" ]

index=rapid7 sourcetype="rapid7:insightvm:asset:vulnerability_finding" OR sourcetype="rapid7:insightvm:vulnerability_definition"
| eval id=vulnerability_id
| transaction id
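For intuition, the rename-and-match idea in the second query (id = vulnerability_id) is the key to lining the two sourcetypes up. A plain-Python sketch of the equivalent lookup-style join, using the field names from the queries above with hypothetical records:

```python
# Hypothetical records mirroring the two sourcetypes: findings carry
# vulnerability_id, definitions carry the same value under the name id.
findings = [
    {"vulnerability_id": "vuln-1", "asset_hostname": "host-a", "solution_fix": "apply patch"},
]
definitions = [
    {"id": "vuln-1", "title": "Example vuln", "description": "hypothetical description"},
]

# Index the definitions by their id, then attach the matching definition
# to each finding -- the same effect as renaming vulnerability_id to id
# and joining on it (unmatched findings simply gain no extra fields).
defs_by_id = {d["id"]: d for d in definitions}
joined = [{**f, **defs_by_id.get(f["vulnerability_id"], {})} for f in findings]
```

After the merge, each row carries both the finding fields (asset_hostname, solution_fix) and the definition fields (title, description), which is what the final table needs.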
As you can see, I haven't even properly tried the transaction one, because I still haven't fully understood how it works.
The main issue is that I want to work with all the values so I can build a table or a stats command that displays the most recent vulnerabilities found by the InsightVM dataset; however, I only get the values from the left search. Whenever I add a stats or a table command to the query using the join command, I get empty values in my table.
i.e:
| table L.asset_hostname, R.title, R.description, L.solution_fix
I have already manually checked that the values of the two fields are the same, and they are. I'd appreciate it if someone would be kind enough to shed some light on this and help me understand what I am doing wrong.
Thanks in advance.
Labels:
- join
- transaction
05-31-2022
01:42 PM
Hello everyone. I'm fairly new to Splunk; I recently started a job as a security analyst in a SOC, where I get to use this cool tool. This question is kind of a continuation of my previous post: https://community.splunk.com/t5/Splunk-Search/Help-on-query-to-filter-incoming-traffic-to-a-firewall/m-p/599607/highlight/true#M208701 I had to make a query to do two things: first, look for any potential policy with any ports enabled; second, find out which of these policies were allowing or tearing down requests coming from public IP addresses. For this I came up with this query, which does the work imo:

index="sourcedb" sourcetype=fgt_traffic host="external_firewall_ip" action!=blocked
| eventstats dc(dstport) as different_ports by policyid
| where different_ports>=5
| eval source_ip=if(cidrmatch("10.0.0.0/8", src) OR cidrmatch("192.168.0.0/16", src) OR cidrmatch("172.16.0.0/12", src),"private","public")
| where source_ip="public"
| eval policy=if(isnull(policyname),policyid,policyid+" - "+policyname)
| eval port_list=if(proto=6,"tcp",if(proto=17,"udp","proto"+proto))+"/"+dstport | dedup port_list
| table source policy different_ports port_list
| mvcombine delim=", " port_list

However, the problem I'm having is that the port list is being shown as if it were one big list, like this:

1
2
3
4
5

I'd like for it to show like this: 1, 2, 3, 4, 5. I've also tried replacing the table command with a stats delim=", " values(port_list), but I've had no success. I'd appreciate it if you could give me some insight into how I could solve this; I had mvjoin in mind but had no clue how to approach it. Thanks in advance.
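To make the desired output concrete: it is just a delimiter join over the multivalue field's values, rather than one value per line. In plain Python terms (hypothetical port values):

```python
# Hypothetical list of ports, one value per row in the multivalue field.
ports = [1, 2, 3, 4, 5]

# Collapse the values into one comma-separated string.
port_list = ", ".join(str(p) for p in ports)
print(port_list)  # -> 1, 2, 3, 4, 5
```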
05-27-2022
10:57 AM
Thanks for the reply. I tried that, but if I put the stats command at the end, then the "source_ip" column from the eval command won't show up in my Statistics tab.
05-27-2022
10:34 AM
Hello.
Recently I joined a new company that is using Splunk as their SIEM, and this past month I've been trying to learn a bit about the tool since I'm completely new to it. As an exercise, I was assigned to work out a query that basically does these two things:
identify potential policies with all ports enabled
identify which of these policies are receiving requests from public IP addresses
So far I've come up with this query:
index="sourcedb" sourcetype=fgt_traffic host="<external firewall ip>" action!=blocked
| eventstats dc(dest_port) as ports by policyid
| stats count by policyid ports
| eval source_ip=if(cidrmatch("10.0.0.0/8", src) OR cidrmatch("192.168.0.0/16", src) OR cidrmatch("172.16.0.0/12", src),"private","public")
| where source_ip="public"
Basically, the main problem I'm having, and can't seem to find a reasonable solution to, is this: I've already managed to figure out how to filter private IP addresses from the results, but I feel like my eventstats line is not working properly, mainly because it seems to be counting all the distinct destination ports overall rather than per policyid.
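As an aside, the private/public classification that the cidrmatch() chain implements covers the RFC 1918 ranges, and the logic can be sanity-checked outside Splunk with Python's ipaddress module (sample addresses are hypothetical):

```python
import ipaddress

# RFC 1918 private ranges, matching the cidrmatch() calls in the query.
PRIVATE_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.0.0/16"),
    ipaddress.ip_network("172.16.0.0/12"),
]

def classify(src: str) -> str:
    """Return 'private' or 'public' for a source IP, like the eval does."""
    addr = ipaddress.ip_address(src)
    return "private" if any(addr in net for net in PRIVATE_NETS) else "public"

print(classify("192.168.1.10"))  # -> private
print(classify("8.8.8.8"))       # -> public
```

This only checks the classification half of the query; the per-policy port counting is a separate question about where eventstats sits in the pipeline.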
I'd be really grateful if you guys could give me a hint or some advice about how I can approach this case.
Thanks in advance.