We are trying to ingest a STIX file into Threat Intelligence Management. The STIX file parses, but nothing of interest is found in it; the _internal index has the message 'status="No observables or indicators found in file"'. The STIX file has the format below which, from what I can tell, is a valid format containing indicators:
{
"more": false,
"objects": [
{
"confidence": "70",
"created": "2023-09-08T00:02:39.000Z",
"description": "xxxxxxxxx",
"id": "xxxxxxx",
"modified": "2023-09-08T00:02:39.000Z",
"name": "xxxxxxx",
"pattern": "[ipv4-addr:value = '101.38.159.17']",
"spec_version": "2.1",
"type": "indicator",
"valid_from": "2023-09-08T00:02:39.000Z",
"valid_until": "2025-11-07T00:02:39.000Z"
},
......
Has anyone had success with STIX files and been able to share the basic format of what worked for them? Or does anyone have anything else to suggest? Many thanks, Simon
Splunk Enterprise Security
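For comparison, here is a minimal sketch of a STIX 2.1 bundle containing a single indicator, based on the STIX 2.1 specification rather than on a confirmed working Threat Intelligence Management import. Whether TIM requires the bundle wrapper (and the pattern_type property) rather than a bare objects array is an assumption worth testing, and the id values are placeholders:

{
  "type": "bundle",
  "id": "bundle--11111111-1111-4111-8111-111111111111",
  "objects": [
    {
      "type": "indicator",
      "spec_version": "2.1",
      "id": "indicator--22222222-2222-4222-8222-222222222222",
      "created": "2023-09-08T00:02:39.000Z",
      "modified": "2023-09-08T00:02:39.000Z",
      "name": "Example indicator",
      "description": "Placeholder description",
      "pattern": "[ipv4-addr:value = '101.38.159.17']",
      "pattern_type": "stix",
      "valid_from": "2023-09-08T00:02:39.000Z",
      "valid_until": "2025-11-07T00:02:39.000Z"
    }
  ]
}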
Hi, I've seen many recent changes in SOAR 6.3 regarding prompts, but I still don't see a way to define the allowed choices list as a parameter when creating a prompt block from the GUI. Often the options available to the user are dynamic, so hard-coding the choices list isn't practical: it is prone to getting out of date and forces playbook redeployments. The only way I see so far is to use code blocks or to add custom code to the prompt blocks (losing the GUI handling in the process). Is there a way I'm missing to get the question choices from a datapath or a custom list?
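For reference, a rough sketch of the custom-code workaround described above, assuming a Splunk SOAR playbook where the choices are collected from an artifact datapath and handed to phantom.prompt2. The datapath, approver, and the exact response_types schema are assumptions from memory and may differ between SOAR versions:

import phantom.rules as phantom

def handle_prompt_response(action=None, success=None, container=None, results=None, handle=None, **kwargs):
    # Inspect the analyst's selection here (left as a stub).
    return

def dynamic_prompt(container=None, **kwargs):
    # Build the choices at run time from an artifact field (hypothetical datapath).
    rows = phantom.collect2(container=container,
                            datapath=["artifact:*.cef.available_actions"])
    choices = sorted({row[0] for row in rows if row and row[0]})

    response_types = [
        {
            "prompt": "Which action should be taken?",
            "options": {
                "type": "list",
                "choices": choices,  # dynamic instead of hard-coded in the GUI
            },
        }
    ]

    phantom.prompt2(container=container,
                    user="Administrator",  # hypothetical approver
                    message="Select the remediation to run.",
                    respond_in_mins=30,
                    name="dynamic_prompt",
                    response_types=response_types,
                    callback=handle_prompt_response)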
Splunk has good write-ups on this at https://lantern.splunk.com/Splunk_Platform/Product_Tips/Administration/Securing_the_Splunk_platform_with_TLS and https://docs.splunk.com/Documentation/Splunk/9.3.1/Security/AboutsecuringyourSplunkconfigurationwithSSL
Splunk is _not_ an active monitoring solution; that's what you use, for example, RANCID or some commercial tools for. But if you get logs from such a tool (or have audit logs from your appliances telling you that a change happened), you can search that data. It will depend on what data you have.
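As a sketch of what such a search could look like once those logs are ingested, assuming a hypothetical index and Cisco IOS-style syslog (the index, sourcetype, and message string are illustrative only):

index=network sourcetype=cisco:ios "%SYS-5-CONFIG_I"
| stats count AS changes latest(_time) AS last_change BY host
| convert ctime(last_change)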
So I've got a list containing multiple strings, and depending on those strings I want to run one or more actions using a filter. When I use the 'in' filter to check whether a certain string is in the list, the matching condition is not met. Example input = ['block_ioc', 'reset_password'] Filter block: I can successfully use the 'in' condition in a decision block, just not in a filter block. Any ideas?
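As a fallback while the filter behaviour is unclear, here is a minimal sketch of doing the membership check in a custom code block instead. It assumes the values arrive via phantom.collect2 as a list of single-element rows (a common shape for datapath results), which is worth verifying in your playbook debug output:

import phantom.rules as phantom

def route_actions(container=None, **kwargs):
    # Hypothetical datapath holding the requested actions.
    rows = phantom.collect2(container=container,
                            datapath=["artifact:*.cef.requested_actions"])
    requested = [row[0] for row in rows if row and row[0]]  # flatten [['block_ioc'], ...]

    if "block_ioc" in requested:
        pass  # call the blocking action/playbook here
    if "reset_password" in requested:
        pass  # call the password-reset action/playbook here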
This doesn't trigger the alert either. My original alert (with traces.count) was triggered once during my tests, when I had 3 traces with errors in a short time period, but then it wasn't triggered anymore. Is there maybe a better way to create an alert for such single events in Splunk? I think the "static threshold" condition is better suited to continuous metrics like CPU usage, but I haven't found any other option so far.
I've looked at similar searches online and have come up with this:
| table "Display Name"
| eval "group" = (random() % 2) +1
| stats list("Display Name") as "Display Name" by "group"
This is returning random names in two groups:

group    Display Name
1        joe blogs 5
         joe blogs 2
         joe blogs 6
2        joe blogs 7
         joe blogs 8
         joe blogs 12
Any ideas how I can set the number returned for each group? Maybe using the limit function?
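One way to cap how many names land in each group (a sketch, assuming at most 5 per group) is to rank the rows within each group before the final stats:

| table "Display Name"
| eval group=(random() % 2) + 1
| streamstats count AS rank BY group
| where rank <= 5
| stats list("Display Name") AS "Display Name" BY group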
Ref Doc - Splunk Add-on for GCP Docs
Currently, the Cloud Storage Bucket input doesn't support pre-processing of data, such as untar/unzip/ungzip/etc. The data must be pre-processed and ready for ingestion in a UTF-8 parseable format.
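A rough sketch of that pre-processing step, assuming gzip-compressed objects and the google-cloud-storage Python client; the bucket and object names are placeholders, and in practice this would likely run in a Cloud Function or a scheduled job:

import gzip
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-logs-bucket")                    # placeholder bucket
src = bucket.blob("exports/events-2023-09-08.json.gz")      # placeholder object

# Download, decompress, and decode to UTF-8 text.
text = gzip.decompress(src.download_as_bytes()).decode("utf-8")

# Write the uncompressed copy to a prefix the add-on's bucket input reads from.
dst = bucket.blob("ingest-ready/events-2023-09-08.json")
dst.upload_from_string(text, content_type="application/json")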
If your values are in a multi-value field, you can do something like this:
| eval choice=mvindex(displayName, random()%200)
If the names are in separate events, you could do something like this:
| eval id=random()%500
| sort 0 id
| head 5
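If you'd rather not hard-code the 200/500, one variant (a sketch, assuming displayName is the multivalue field) sizes the modulus from the field itself:

| eval choice=mvindex(displayName, random() % mvcount(displayName))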
From what you are saying, and reading between the lines, I am guessing that when All is chosen, the value of the token is set to "*". When this is used in a search, e.g. field=$token$, the "*" will equate to all non-null values, which is why your search is not dealing with "empty values". To get around this, you may have to change the way the token is set up and the way it is used. For example, if you change the value prefix to <valuePrefix>field="</valuePrefix> and the value suffix to <valueSuffix>"</valueSuffix>, and treat the choice of "All" as setting an empty token, then your search can use $token$ instead of field=$token$. This is something that is easier to do in Classic/SimpleXML dashboards than in Studio.
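A rough, untested sketch of the SimpleXML input described above; the field name, the populating search, and how the "All" choice gets mapped to an empty token (typically via a <change> handler, omitted here) are all assumptions to adapt:

<input type="multiselect" token="token">
  <label>Field</label>
  <choice value="*">All</choice>
  <valuePrefix>field="</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>field</fieldForLabel>
  <fieldForValue>field</fieldForValue>
  <search>
    <query>index=your_index | stats count by field</query>
  </search>
</input>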
- Check in the OS firewall that the port is enabled.
- Check that the correct sourcetype is configured.
- Try to search the data on the indexer itself to verify it's not a connectivity issue between the search head and the indexer (a quick check is sketched below).
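For the last check, a quick sketch that can be run directly on the indexer (adjust the time range as needed) to see whether anything from the firewall was indexed at all:

| tstats count WHERE index=* earliest=-60m BY index, sourcetype, host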
Hi @dikaaditsa,
Which index did you configure in your input, and is it the one you're using in your search? Did you install the Fortinet FortiGate Add-On for Splunk (https://splunkbase.splunk.com/app/2846) to get correct parsing?
Also, it isn't best practice to use Splunk itself to receive syslog. The better approach is to configure a syslog receiver (e.g. rsyslog or syslog-ng) that writes logs to disk and then have Splunk read those files. This way your syslog input stays active even if Splunk is down, and it puts less load on the system, avoiding queues.
Does your distributed search (you have one SH and one IDX) run correctly? In other words, are other searches executed correctly?
Ciao. Giuseppe
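A minimal sketch of that setup, with placeholder paths, source IP, index, and sourcetype (check the FortiGate add-on documentation for the sourcetype it actually expects):

# /etc/rsyslog.d/30-fortigate.conf
module(load="imudp")
input(type="imudp" port="514")
# 192.0.2.10 is a placeholder for the FortiGate's IP
if $fromhost-ip == '192.0.2.10' then {
    action(type="omfile" file="/var/log/fortigate/fortigate.log")
    stop
}

# inputs.conf on the Splunk instance that reads the files
[monitor:///var/log/fortigate/fortigate.log]
index = network
sourcetype = fortigate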
Hi All, I have already configured log ingestion from FortiGate using syslog; the logs are sent over UDP on port 514. I also set up a data input in Splunk Enterprise to receive the data on port 514. When I run tcpdump on the Splunk VM, the data is successfully flowing from the FortiGate to the Splunk VM, but when I search the data from Splunk Web, no data appears. Currently I ingest the data into one indexer and search the data from a separate search head. Please give me advice to solve my issue. Thank you.
After debugging in many ways, I found out that a field I'm using in the query does not include empty values of that field while "All" is selected. Do you know how I can also include empty values when "All" is selected in the multiselect dropdown?
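One hedged workaround at search time, assuming your_field is the field behind the multiselect: give empty values a placeholder before the token filter is applied, so that "*" matches them too:

index=your_index
| fillnull value="(empty)" your_field
| search your_field=$token$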