Hello @nina, There are a few ways:
- If you are planning to showcase some use cases as part of your project, Splunk Security Essentials (https://splunkbase.splunk.com/app/3435) has built-in datasets, for example for sample brute-force attack detection.
- https://github.com/splunk/botsv3 has a number of sample datasets for multiple sourcetypes.
- You can use EventGen (https://splunkbase.splunk.com/app/1924) to generate "more" events based on existing event formats.
Please accept the solution and hit Karma, if this helps!
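As a rough illustration of the EventGen option: it is typically driven by a stanza in eventgen.conf that points at a sample log file and replaces tokens (such as timestamps) on each generated event. The stanza name and sample file below are hypothetical, just a sketch of the usual shape:

```
[sample_bruteforce.log]
mode = sample
interval = 60
count = 10
earliest = -60s
latest = now
# Rewrite the ISO timestamp in each sample event to the generation time
token.0.token = \d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}
token.0.replacementType = timestamp
token.0.replacement = %Y-%m-%dT%H:%M:%S
```

With something like this in place, EventGen replays the sample file on the given interval with fresh timestamps, which is usually enough for demo dashboards.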
Hi @gcusello, Thanks for replying. I am using a standalone search head and would like to move to a new standalone search head, not a search head cluster. Is the process for migrating apps to a standalone search head the same?
Hello @grotti, If I understand the issue correctly, you are getting the expected results, but not for 12 hours. Is that right? If so, you can use the "| addinfo" command as below:

| inputlookup append=T incident_review_lookup
| addinfo
| where time>=info_min_time
| rename user as reviewer
| `get_realname(owner)`
| `get_realname(reviewer)`
| eval nullstatus=if(isnull(status),"true","false")
| `get_reviewstatuses`
| eval status=if((isnull(status) OR isnull(status_label)) AND nullstatus=="false",0,status)
| eval status_label=if(isnull(status_label) AND nullstatus=="false","Unassigned",status_label)
| eval status_description=if(isnull(status_description) AND nullstatus=="false","unknown",status_description)
| eval _time=time
| fields - nullstatus

It will give you results based on whatever time range you select from the time range picker. Please accept the solution and hit Karma, if this helps!
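For reference, addinfo works here because it attaches the current search's time range to every result as fields (info_min_time, info_max_time, etc.), which the where clause then compares against. A quick way to see these fields on their own:

```
| makeresults
| addinfo
| table info_min_time, info_max_time, info_search_time
```

So filtering on time>=info_min_time keeps only lookup rows that fall inside whatever range is picked in the time range picker.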
Hi @gcusello, I can explain the problem with some screenshots. The logs are related to an antivirus (policies, detected viruses and so on). In the first image you can see the file was created at 00:35:00; this is an antivirus scan. This is the content of the file: ...but as you can see, the timestamp shows 06:35 (that's why I added the TZ option in props.conf). Finally, this is an image of the Splunk search; the _time column is aligned with the timestamp in the log content. The record was supposed to arrive at 00:35, but was indexed at 06:35 (6 hours after the scan). The time zone is set to GMT-6. I tried to look in the AV settings to set the time to GMT-6, but it does not have that option.
Hi, does the new export feature also allow exporting a table panel to PDF including the entire data set, in case the data overflows the size of a single table page and pagination kicks in, hiding the remaining events on other pages? Thank you, Wojtek
Hello everyone, I'm working on a project, "Splunk Enterprise: an organization's go-to in detecting cyber threats". Please, how/where can I get datasets and logs that I can use for my project?
@gcusello I added INDEXED_EXTRACTIONS=csv, then I restarted the Splunk daemon:

[my_custom_sourcetype]
CHARSET=UTF-8
INDEXED_EXTRACTIONS=csv
TIME_FORMAT=%Y-%m-%dT%H:%M:%S,
TIME_PREFIX=^
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=true
TZ=America/Mexico_City
disabled=false

But I keep receiving logs from 6 hours ago. Copying the last log received in Splunk:

9/30/23 6:35:02.000 AM
2023-09-30T06:35:02,Time of completion: 00:35:02 ***** 0 sec (00:00:00)
host = ******* source = /var/log/****/*****log.****.txt sourcetype = my_custom_sourcetype

As you can see, the last log was received at 06:35:02 AM but was created at 00:35:02 in my current time in Mexico City. At the moment no more logs show up in Splunk, but now I realized the logs come split for some reason.
Hi I'm currently working on obtaining Windows Filtering Platform event logs to identify the user responsible for running an application. My goal is to enhance firewall rules by considering both the application and the specific user. To achieve this, I've set up a system to send all logs to Splunk, which is already operational. However, I've encountered an issue with WFP event logs not displaying the authorized principal user who executed the application. This absence of user information makes it challenging to determine who used what application before I can further refine the firewall rules. If you have any insights or suggestions on how to address this issue, I would greatly appreciate your assistance. I can readily access various details such as destination, source, port, application, and protocol, but the missing username is a crucial piece of information I need. Thank you for any guidance you can provide.
Hi, Please try below:

| stats max(A) as ACnt, max(B) as BCnt, max(C) as CCnt by month, id
| stats sum(ACnt) as ACnt, sum(BCnt) as BCnt, sum(CCnt) as CCnt by month
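To see why two stats passes are used (the first collapses duplicate rows per id, the second totals per month), here is a self-contained sketch with made-up sample data; the field names A, B, C, the months, and the ids are hypothetical, and makeresults format=csv requires Splunk 9.0 or later:

```
| makeresults format=csv data="month,id,A,B,C
Jan,1,1,0,0
Jan,1,1,0,0
Jan,2,0,1,0
Feb,1,0,0,1"
| stats max(A) as ACnt, max(B) as BCnt, max(C) as CCnt by month, id
| stats sum(ACnt) as ACnt, sum(BCnt) as BCnt, sum(CCnt) as CCnt by month
```

The duplicate Jan/id=1 rows contribute only once to the January totals, which is the point of the intermediate max-by-id step.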
Hi @ucorral, it's really strange, because this seems to be a csv file and you don't have INDEXED_EXTRACTIONS = csv in the props.conf of the Universal Forwarder. Ciao. Giuseppe
Hi @Utkc137, you could try stats with eval:

<your_search>
| stats
count(eval(A=1)) AS A_id_cnt
count(eval(B=1)) AS B_id_cnt
count(eval(C=1)) AS C_id_cnt
BY month

If it doesn't run, please also try:

<your_search>
| stats
count(eval(A="1")) AS A_id_cnt
count(eval(B="1")) AS B_id_cnt
count(eval(C="1")) AS C_id_cnt
BY month

Ciao. Giuseppe
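The difference between the two variants is only whether the field holds a number or a string: count(eval(A=1)) counts events where the comparison is true, and a string-valued field needs the quoted form. A quick way to check the behavior with made-up data (field name A and the values are hypothetical):

```
| makeresults count=3
| streamstats count as n
| eval A=if(n<=2,1,0)
| stats count(eval(A=1)) AS A_id_cnt
```

If this returns 0 instead of the expected 2 on real data, the field is likely a string and the quoted comparison is the one to use.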