Hi, I'd use SEDCMD in props.conf. You can find more details in the props.conf.spec. This is used for anonymization, but it should also work for your use case. If you want detailed steps to set it up, you can follow this guide: Anonymize data with a sed script. smurf
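A minimal sketch of what that could look like in props.conf — the sourcetype name and the masking pattern are placeholders, not from your environment:

```
[my_sourcetype]
# Mask anything shaped like an SSN at index time (sed-style s/// syntax per props.conf.spec)
SEDCMD-mask_ssn = s/\d{3}-\d{2}-\d{4}/XXX-XX-XXXX/g
```

Note that SEDCMD applies at index time, so it only affects data ingested after the change takes effect.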
Hi, This might not be the answer you are looking for, but a better practice for your use case would be to use Summary Indexing. You would do basically the same as you do with the lookup but use an index instead. With this, you would be able to search your data as you would any other indexes. smurf
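A rough sketch of the idea, assuming a summary index named `my_summary` already exists and the scheduled search produces the same stats you currently write to the lookup:

```
index=web sourcetype=access_combined
| stats count AS hits BY host
| collect index=my_summary
```

You would then point your dashboards at `index=my_summary` and search it like any other index.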
You can ingest it directly with the Splunk instance running ES and set it up to forward logs to your indexer. You should also be able to see it in your Monitoring Console. smurf
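If you go that route, forwarding from the ES search head is configured in outputs.conf; a minimal sketch, where the group name and indexer address are placeholders:

```
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = 10.0.0.5:9997
```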
I don't mean real-time searches but the real-time schedule type. That's the schedule type that skips time windows, unlike the continuous schedule, which continues where it left off. That's why I used longer search windows: even if a few runs are skipped, I would still query all logs from the downtime period. The name is very confusing, TBH.
Hi, As I understand it, continuous searches are never skipped and will run whenever Splunk is available after downtime, or when it has the resources to run them. The downside is that real-time scheduled searches have higher priority, so if your pipeline is filled with real-time scheduled searches, your continuous search might never run. Or so I was told. I never had an issue with it when I used it, but our partner suggested migrating to real-time scheduling. After that, we used real-time scheduling for almost everything while specifying a larger search window with matching throttling. I suggest going through these articles, as they might answer most of your questions: Prioritize concurrently scheduled reports in Splunk Web; Configure the priority of scheduled reports (real-time vs. continuous scheduling). smurf
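The schedule type itself is controlled by the `realtime_schedule` setting in savedsearches.conf; a sketch, where the stanza name is a placeholder:

```
[My Scheduled Report]
# 1 = real-time scheduling (higher priority, may skip windows)
# 0 = continuous scheduling (never skips, catches up after downtime)
realtime_schedule = 1
```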
Hi, your regex seems wrong, so it does not extract anything. I recommend using something like regex101 to test your regex. Try replacing it with this (it should work with your sample event): | rex "busDt=(?<busDt>.*?),\sfileName=(?<fileName>.+?),.*collateralSum\s(?<collateralSum>[\d.E]+)\sopeningBal\s(?<openingBal>[\d.E]+)\sageBalTot\s(?<ageBalTot>[\d.E]+)" Note that inside a character class you don't need the pipe as a separator or the backslash before the dot. Hope this helps. smurf
Hi, I dealt with a similar scenario. I would use a lookup to get a list of servers. I would also add the threshold to the lookup (host, threshold) to future-proof it. Then you can append the list and do some dedup/stats magic, or start with inputlookup and join your search. smurf
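A sketch of the append approach, assuming a lookup file `servers.csv` with fields host and threshold, and a placeholder source search for the per-host metric:

```
index=os sourcetype=cpu
| stats latest(cpu_load) AS cpu_load BY host
| append [| inputlookup servers.csv]
| stats values(cpu_load) AS cpu_load, values(threshold) AS threshold BY host
| where isnull(cpu_load) OR cpu_load > threshold
```

This flags hosts from the lookup that either reported nothing or exceeded their threshold.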
Hi, During my installation (v6.0.0), I ran into errors regarding disk space, but I could bypass them by adding the --ignore-warnings parameter. According to the documentation, your version supports it too. If you ignore warnings, I suggest doing a --dry-run first. smurf
Hi, It seems that this feature was introduced in v5.5.0 (at least based on the documentation). Are you using the appropriate version? smurf
Hi, There are multiple ways to deploy SC4S. You should refer to the runtime part of the documentation to find what suits you best. You can view the documentation here: https://splunk.github.io/splunk-connect-for-syslog/main/gettingstarted/getting-started-runtime-configuration/ smurf
Hi, From what I understand: The Phantom one is your license for the Phantom itself. The Splunk one is for the included Splunk Enterprise install. You can find it at $PHANTOM_HOME/splunk/ From the documentation: If Splunk SOAR (On-premises) is installed as a stand-alone product, it includes a version of Splunk Enterprise as the internal search engine. You can also configure Splunk SOAR (On-premises) to use an external Splunk instance for searching. A Splunk SOAR (On-premises) cluster also requires an external Splunk Enterprise instance. You can find more here https://docs.splunk.com/Documentation/SOARonprem/6.0.0/Install/ExternalSplunk smurf
No probs. The definition of a macro is the search itself. So it could look something like this:

[cim_Endpoint_indexes]
definition = (index=index1 OR index=index2)

You can find more details in the macros.conf spec: macros.conf - Splunk Documentation smurf
Hi, first, I would check whether a firewall dropped anything. So search the index with firewall logs for the user's IP address and the website's IP address, most likely on port 80 or 443 since it is a website. I would do the same for any other network device like an IPS/IDS. Hope this helps, at least a little. smurf
Hi, you can concatenate strings together with eval. | eval src="src=" + src This would result in the src field containing "src=192.168.0.1". For a few fields, this is easy to do. smurf
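If you need this for many fields, a foreach sketch could save typing — the field names here are placeholders:

```
| foreach src dest_ip dest_port
    [ eval <<FIELD>> = "<<FIELD>>=" . '<<FIELD>>' ]
```

foreach substitutes each listed field name for <<FIELD>> in the template, so every field gets prefixed with its own name.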
Hi, I usually just rename the join field at the end of the subsearch, or create the join field in the original search with eval. It could look something like this:

index=firewall
| fields firewall._time, firewall.src, firewall.dest_ip
| eval edr.RemoteIP = 'firewall.dest_ip'
| join edr.RemoteIP
    [ index=edr | fields edr.username, edr.processname, edr.RemoteIP ]

Note the single quotes around the field name in eval, since it contains a dot. smurf
Hi, Using REST, you can query index information. This should give you a list of indexes and their data paths. If you have multiple indexers, you need to dedup the indexes, or specify on which indexer you want to run the REST command. | rest /services/data/indexes | fields title, homePath, coldPath, thawedPath | dedup title
Hi, you could add the "Log Event" adaptive response action to your correlation search, or create a new search that matches notables and has "Log Event" as its action. You can specify the index, sourcetype, source, and host, as well as the message of the event.
If you are looking only for the total number of events, you could use tstats. It searches the indexed summaries (tsidx files) rather than raw events, so it tends to be quite fast, but very long time ranges could still time out. Another possibility would be using summaries. You could schedule a search to run every day/week/month for the specific period and have the visualization search run on the summary data. You can find more about summary indexing here: Use summary indexing for increased search efficiency - Splunk Documentation
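A sketch of the tstats approach — the index name is a placeholder:

```
| tstats count WHERE index=web BY _time span=1d
```

This gives a daily event count without touching the raw events, which is usually much faster than a plain `index=web | timechart count`.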
Hi, you would need to make some changes, but you can embed scheduled reports into an HTML page (incl. SharePoint). So you would have to create a report for each dashboard panel and recreate the dashboard layout in SharePoint. You can click Edit at any report and select Embed. Make sure to read and understand the message it gives you.
Hi, It will not have any impact on the users' own preferences. As per the documentation here https://docs.splunk.com/Documentation/Splunk/9.0.0/Admin/authorizeconf, the capability "Lets a user create, edit, or remove other users.".
Hi, docker start is for starting a stopped container, not for entering it. Since you are already connected to Splunk, your container is already running, so you should use docker exec to get a shell in it: docker exec -it <container_name> bash