I have the same problem with Amazon Linux 2023 (kernel 6.1). The reason is that Splunk Enterprise does not support Linux kernel 6; I noticed that Splunk has no problem with Amazon Linux 2023 (kernel 5). In short, Splunk does not support the OS (Linux kernel) that you are using.
Greetings, I found some useful savedsearches under SA-AccessProtection / DA-ESS-AccessProtection, which I am interested in using. However, I'd like to understand these use cases before making them live. Are these apps and their content documented somewhere? So far, I have not had any luck. Thanks!
I have a dashboard that a specific team uses. Today they asked why one of the panels was broken. Looking into it, we were receiving this error from the search:

Error in 'fit' command: Error while fitting "StateSpaceForecast" model: timestamps not continuous: at least 33 missing rows, the earliest between "2024-01-20 07:00:00" and "2024-01-20 09:00:00", the latest between "2024-10-02 06:00:00" and "2024-10-02 06:00:01"

That seemed pretty straightforward; I thought we might be missing some timestamp values. This is the query we are running:

| inputlookup gslb_query_last505h.csv
| fit StateSpaceForecast "numRequests" holdback=24 forecast_k=48 conf_interval=90 output_metadata=true period=120

Looking into the CSV file itself, I went to look for missing values under the numRequests column. We have values for each hour going back almost a year. Looking at the timestamps mentioned in the error, there is an hour missing between the first pair: the timestamp for 08:00. That may be the cause.

How would I go about efficiently finding the 33 missing values? Each missing value would be in between two hours. Will I have to go through and find skipped hours among 8k results in the CSV file? Thanks for any help.
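One way to avoid scanning 8k rows by hand is to let streamstats compare each row's timestamp with the previous one and report only the gaps. A minimal sketch, assuming the CSV's time column is named timestamp and formatted like the values in the error message (adjust the field name and the strptime format to match your lookup):

| inputlookup gslb_query_last505h.csv
| eval _time=strptime(timestamp, "%Y-%m-%d %H:%M:%S")
| sort 0 _time
| streamstats current=f last(_time) AS prev_time
| eval gap_hours=round((_time - prev_time) / 3600)
| where gap_hours > 1
| eval gap_start=strftime(prev_time, "%Y-%m-%d %H:%M:%S"), gap_end=strftime(_time, "%Y-%m-%d %H:%M:%S"), missing_rows=gap_hours - 1
| table gap_start gap_end missing_rows

Each output row is one gap: the last hour present before it, the first hour present after it, and how many hourly rows are missing in between. If the lookup is complete apart from the gaps the fit command complained about, the missing_rows column should sum to 33.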
JSON is a structured format, so the order of fields should not matter to the recipient; after all, you will be addressing the fields by their names. Unless you're manipulating that JSON on the receiving end using something not fit for JSON processing. In that case, maybe JSON is not the best format choice.
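As a quick illustration in SPL (a toy event, not from the thread), spath pulls fields out of a JSON string by name, so key order in the raw text is irrelevant:

| makeresults
| eval raw="{\"b\": 2, \"a\": 1}"
| spath input=raw
| table a b

Both fields come out correctly even though b appears before a in the string.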
Hi @hazem, it depends on your use cases: what do you want to monitor? Use a log level that gives you the data you need; I cannot tell you from the outside what the best log level is. In general, except in some situations, I'd avoid the Debug level and use the Alert level, but, as I said, it depends on your use cases. In addition, this is a question for Palo Alto experts rather than Splunk, because they know the contents of each log level.
Ciao.
Giuseppe
Thank you @gcusello for your reply. Our customer has asked me about the recommended log level that should be sent, for example, from Palo Alto to Splunk. Do you have any answer for this?
Hi @hazem, for many products there are integration guides for Splunk developed by the third-party vendor themselves, and the only way to find them is to search on Google (e.g., for Sophos see https://partnernews.sophos.com/it-it/2021/05/prodotti/splunk-integration-for-sophos-firewall/). Anyway, searching for "Splunk Getting Data In" gives you a guide to data ingestion in Splunk: https://docs.splunk.com/Documentation/Splunk/latest/Data/WhatSplunkcanmonitor. In any case, the first step in data onboarding should be to identify the technology to ingest and search for the related add-on on apps.splunk.com, which usually guides users through the integration.
Ciao.
Giuseppe
Hi, the website behind the link STEP | Splunk Training and Enablement Platform: Course and Class Details shows me only classes that cost 750 USD. Do I have to apply a code during the booking process? Furthermore, I'm getting the following error message when I try to book the class.
Is there any guide on how to configure security products to send their logs to Splunk, or on the recommended logs that should be sent, similar to the DSM guide in QRadar?
I don't think it is possible to change the order of the fields with the tojson command. But try creating a JSON object with eval and the json_object function; maybe that accomplishes your goal.

...your search...
| eval my_json_object=json_object("timestamp", timestamp, "Subject", Subject, "emailBody", emailBody, "operation", operation)
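For anyone who wants to test this before touching their real search, a minimal sketch on dummy data (the field values here are invented for illustration):

| makeresults
| eval timestamp=strftime(now(), "%Y-%m-%dT%H:%M:%S"), Subject="test subject", emailBody="test body", operation="send"
| eval my_json_object=json_object("timestamp", timestamp, "Subject", Subject, "emailBody", emailBody, "operation", operation)
| table my_json_object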
Hi @Alex_Rus, please try this regex in the blacklist option:

(?ms)EventCode: 4624.*Account Name: MY_Account Name.*Source Network Address: MY_Source Network Address

Otherwise, please try it in transforms.conf (on the indexers) to filter the events out.
Ciao.
Giuseppe
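If the indexer-side route is chosen, the usual pattern is a props.conf/transforms.conf pair that routes matching events to the nullQueue. A sketch, assuming the events arrive with sourcetype WinEventLog:Security (the stanza name drop_my_4624 is arbitrary; adjust both to your environment):

props.conf:
[WinEventLog:Security]
TRANSFORMS-drop4624 = drop_my_4624

transforms.conf:
[drop_my_4624]
REGEX = (?ms)EventCode: 4624.*Account Name: MY_Account Name.*Source Network Address: MY_Source Network Address
DEST_KEY = queue
FORMAT = nullQueue

Both files go on the indexers (or the first heavy forwarder in the path), followed by a restart.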
Log Name: Security
EventCode: 4624
EventType: 0
ComputerName: MY_ComputerName
SourceName: Microsoft Windows security auditing.
Type: Information
RecordNumber: 93370261535
Keywords: Audit Success
TaskCategory: Logon
OpCode: Info
Message: An account was successfully logged on.

Subject:
    Security ID: S-1-0-0
    Account Name: -
    Account Domain: -
    Logon ID: 0x0

Logon Information:
    Logon Type: 3
    Restricted Admin Mode: No
    Virtual Account: No
    Elevated Token: Yes

Impersonation Level: Impersonation

New Logon:
    Security ID: S-1-5-21-877741627-2216037581-1742749770-81699
    Account Name: MY_Account Name
    Account Domain: MY_Account Domain
    Logon ID: 0x2153A91CB
    Linked Logon ID: 0x0
    Network Account Name: -
    Network Account Domain: -
    Logon GUID: {-}

Process Information:
    Process ID: 0x0
    Process Name: -

Network Information:
    Workstation Name: -
    Source Network Address: MY_Source Network Address
    Source Port: Port

Detailed Authentication Information:
    Logon Process: Kerberos
    Authentication Package: Kerberos
    Transited Services: -
    Package Name (NTLM only): -
    Key Length: 0

This event is generated when a logon session is created. The subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.

The logon type field indicates the kind of logon that occurred. The most common types are 2 (interactive) and 3 (network).

The New Logon fields indicate the account for whom the new logon was created, i.e., the account that was logged on.

The network fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.

The impersonation level field indicates the extent to which a process in the logon session can impersonate.

The authentication information fields provide detailed information about this specific logon request.
- Logon GUID is a unique identifier that can be used to correlate this event with a KDC event.
- Transited services indicate which intermediate services have participated in this logon request.
- Package name indicates which sub-protocol was used among the NTLM protocols.
- Key length indicates the length of the generated session key. This will be 0 if no session key was requested.
Hi @Alex_Rus, you have to find a regex to filter your events; you can test your regex using the regex command. Then put the regex in inputs.conf:

blacklist = key=regex

If you want more help, please share a sample of the logs to filter.
Ciao.
Giuseppe
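For Windows Event Log inputs specifically, the key=regex form can name event fields such as EventCode and Message. A sketch of what that might look like in the universal forwarder's inputs.conf (the account name and address are placeholders to adapt):

[WinEventLog://Security]
blacklist1 = EventCode="^4624$" Message="(?ms)Account Name:\s+MY_Account Name.*Source Network Address:\s+MY_Source Network Address"

Here EventCode and Message are the keys and each quoted value is a regex; the event is dropped at the forwarder only when both match, so it never consumes license or indexing capacity.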