Apart from all that @richgalloway already mentioned, this document shows the results of testing on some particular reference hardware. It's by no means a guarantee that an input will work with this performance. Also remember that Windows event log inputs get the logs by calling the system using the WinAPI, whereas the file input just reads the file straight from the disk (most probably using memory-mapped files, since that's the most effective method). And last but definitely not least - as I already pointed out - the UF typically doesn't break data into events!
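To illustrate the difference, here is a minimal sketch of the two input types being compared (the channel name, file path and sourcetype are illustrative, not taken from the original post):

inputs.conf:
# Event log input - events are pulled through the Windows Event Log API
[WinEventLog://Security]
disabled = 0

# File monitor input - the file is read straight from disk
[monitor://C:\logs\app.log]
disabled = 0
sourcetype = app_log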
That document is for a specific source where the event size is well-defined. The information there cannot be generalized because the size of an "event" is unknown. I've seen event sizes range from <100 to >100,000 bytes, so it is very difficult to produce an EPS number without knowing more about the data you wish to ingest. It's possible the documentation for other TAs provides the information you seek. Have you looked at the TAs for your data?
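If you want to estimate your own event size before doing any EPS math, a quick sketch like this can help (the index name is a placeholder; len(_raw) counts characters, which approximates bytes for ASCII data):

index=your_index earliest=-1h
| eval bytes=len(_raw)
| stats count avg(bytes) AS avg_event_bytes perc95(bytes) AS p95_event_bytes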
I know there is the Splunk Add-on for AWS, but I heard there is a simpler and easier way to read the buckets directly without using that add-on. Is that true?
Thank you @PickleRick for your concern.
1. I have tried to embed a report on my website by following this document: Embed scheduled reports. But I was not able to embed the Splunk report on my website; it was showing "Report not available" and the console showed a 401 Unauthorized status code. Please check this image and reply.
2. I will look into the app Embedded Dashboards For Splunk (EDFS).
3. Sure, I will try to use a backend (server-side) service to get Splunk data securely using the REST API. Let me explore the REST API for the backend service and find out where I have to request REST API access.
Unfortunately, all the required fields are present in the raw event. The tags were also produced correctly. In this (quite old) thread https://community.splunk.com/t5/Splunk-Search/Define-user-field-in-Security-Essentials/m-p/312738 I've found an issue somewhat connected to mine, as my problems are also related to the user and src_user field extraction, and my AD server is also in a non-English language. Has anyone found the underlying issue?
Hello, I am trying to create a custom view (via XPath) in Event Viewer and later ingest it into Splunk via a "WinEventLog" input, leveraging the Windows add-on. Can it be done using "WinEventLog" (or some other way) in inputs.conf, as it is for Application/Security/System?

[WinEventLog://MyCustomLog]

As suggested here, I tried this configuration, but no logs were onboarded and no error appeared in the _internal logs either. Has anyone found a solution for ingesting these newly created custom views from Event Viewer into Splunk? Thanks
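One caveat worth checking (an assumption on my part, not from the original post): Event Viewer custom views are saved XPath filters, not real event log channels, so a WinEventLog:// stanza can generally only read actual channels. A hedged sketch of the common workaround is to point the input at the underlying channel and filter there (the channel name, event codes and index are illustrative):

inputs.conf:
# Read the real channel the custom view is built on, then filter
[WinEventLog://Microsoft-Windows-PowerShell/Operational]
disabled = 0
whitelist = 4103,4104
index = winevents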
I am looking for something similar to this. https://docs.splunk.com/Documentation/WindowsAddOn/8.1.2/User/PerformancereferencefortheSplunkAdd-onforWindows
Hi @gcusello did you get answer from @woodcock regarding applying on all etc/system/local/authorize.conf search head nodes (preferably from GUI if possible) ? Thanks.
It's interesting what you write here; syslog-ng was the first one to support TCP-based logging, so it definitely supports that. Dropping of UDP packets is a known issue, but there are a lot of ways to mitigate it. Here are a couple of blog posts: https://axoflow.com/syslog-over-udp-kernel-syslog-ng-tuning-avoid-losing-messages/ https://axoflow.com/detect-tcp-udp-message-drops-syslog-telemetry-pipelines/ Kafka is just a message bus; you will need a component to actually receive the messages and then hand them over to Kafka. Of course syslog-ng can do that: https://axoflow.com/docs/axosyslog-core/chapter-destinations/configuring-destinations-kafka-c/
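For reference, a minimal sketch of that receive-and-forward setup in syslog-ng (the port, broker address and topic name are placeholders; see the linked kafka-c documentation for the full option list):

syslog-ng.conf:
# Receive syslog over TCP, hand the messages to Kafka
source s_net {
    network(transport("tcp") port(1514));
};
destination d_kafka {
    kafka(bootstrap-servers("kafka1:9092") topic("syslog"));
};
log {
    source(s_net);
    destination(d_kafka);
};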
I finally got it this way:

transforms.conf:
[my-log]
REGEX = FICHERO_LOG(\d+)\s+==\s+([^=\n]+)\n
FORMAT = $1::$2
MV_ADD = true

props.conf:
REPORT-log = my-log

Thank you all for your help.
The whole idea is tricky.

If you want to use an embedded report - that's relatively easy and safe - you can embed _scheduled_ reports, so they are generated using a predefined configuration and you can embed the resulting report into your external webpage.

With more "dynamic" stuff it's way more tricky, because it involves a whole lot of making sure you don't accidentally create a security vulnerability for your environment and don't let the user do much more with your Splunk installation than you initially intended. There is an app https://splunkbase.splunk.com/app/4377 which is supposed to do that, or at least help with it, but I haven't used it and I don't know what limitations it has.

Of course another option would be to use the REST API in your app to call the Splunk servers and generate the proper output, which you could then present to your user. But due to the security issues I mentioned before, it's something you'd rather do in your backend - and only present the user with the results embedded in your web app by said backend - than let the browser call the Splunk search heads directly.
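A minimal backend sketch of that REST API approach (the host, credentials and search string are placeholders; /services/search/jobs on management port 8089 is Splunk's standard search endpoint):

import time
import requests

BASE = "https://splunk.example.com:8089"  # management port, not the web UI port
AUTH = ("svc_account", "password")        # in a real setup, prefer a token

# Start a search job (verify=False only for a lab with self-signed certs)
resp = requests.post(f"{BASE}/services/search/jobs",
                     auth=AUTH, verify=False,
                     data={"search": "search index=web | stats count by status",
                           "output_mode": "json"})
sid = resp.json()["sid"]

# Poll until the job is done
while True:
    job = requests.get(f"{BASE}/services/search/jobs/{sid}",
                       auth=AUTH, verify=False,
                       params={"output_mode": "json"}).json()
    if job["entry"][0]["content"]["isDone"]:
        break
    time.sleep(1)

# Fetch the results as JSON for your web frontend
results = requests.get(f"{BASE}/services/search/jobs/{sid}/results",
                       auth=AUTH, verify=False,
                       params={"output_mode": "json"}).json()
print(results["results"])

This way the browser never talks to the search head directly; only your backend holds the credentials.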
@pranay03 Were you able to resolve this issue? I am facing the same issue as well. The console log shows it started watching; however, I cannot see the logs in the web portal. Please let me know your thoughts and suggestions.
Apart from all the valid remarks from @gcusello and @richgalloway, think about what you need. If you want to build on the search shown by @gcusello, notice that yours groups login attempts by source (which means it will show multiple attempts to log in with different usernames from the same IP as one result), whereas the other one groups by username, which means it will aggregate login attempts to the same account launched from different IPs, but split login attempts to different accounts from the same IP into separate results. So it's a question of what you are looking for; see the sketch below.
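A side-by-side sketch of the two groupings (the index, EventCode and field names are placeholders - adjust to your data):

One row per attacking source, however many accounts it tried:
index=wineventlog EventCode=4625 | stats count values(user) AS users BY src

One row per targeted account, however many sources tried it:
index=wineventlog EventCode=4625 | stats count values(src) AS sources BY user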
First and foremost - don't receive syslog directly on your Splunk component (UF or HF/idx). Its performance is sub-par, it doesn't capture reasonable metadata about the transport layer, and it doesn't scale well - as you can see.

But if you're trying to do it anyway, the easier way would be to simply send your events to separate ports and associate specific sourcetypes and indexes with those ports. That's much easier to handle than all this overwriting of things.

If you insist on doing it this way, the most important things to remember when analyzing such configurations are:
1. Transform classes are applied in alphabetical order (so TRANSFORMS-my_transforms_a will be used _after_ TRANSFORMS-aaaa_my_transforms_zzz).
2. Transforms within a single transform class are applied in left-to-right order.
3. In a simple configuration (without fancy ruleset-based entries), _all_ matching transforms are "executed" - processing isn't stopped just because you've already overwritten some particular metadata field. If any subsequent transform should overwrite the field, it will.

So in your configuration, TRANSFORMS-ot_nozomi should be executed first, redirecting some events to the ot_nozomi index; then Splunk would execute TRANSFORMS-ot_windows and redirect events to ot_windows (even events that were already redirected to the ot_nozomi index - the destination index gets overwritten if they match both regexes). It _should_ work this way; see the sketch below. If it doesn't, I'd check whether there are issues with either config file precedence or sourcetype/source/host precedence.
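A minimal sketch of both approaches (port numbers, stanza names, regexes and indexes are illustrative, not taken from your config):

inputs.conf - the simpler per-port alternative:
[tcp://5141]
sourcetype = nozomi
index = ot_nozomi

[tcp://5142]
sourcetype = wineventlog_syslog
index = ot_windows

props.conf - the index-overwriting approach, showing the ordering:
[my_syslog]
TRANSFORMS-aaa_route = route_to_nozomi
TRANSFORMS-zzz_route = route_to_windows

transforms.conf:
[route_to_nozomi]
REGEX = nozomi
DEST_KEY = _MetaData:Index
FORMAT = ot_nozomi

[route_to_windows]
REGEX = MSWinEventLog
DEST_KEY = _MetaData:Index
FORMAT = ot_windows

Because zzz sorts after aaa, route_to_windows runs last and wins whenever an event matches both regexes.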
UGH. If you have any say in this, try to force the team responsible for producing these logs to deliver them in some reasonable format. It's a mix of pseudo-syslog embedded in pseudo-JSON, containing some "kinda delimited" key/value pairs. It's not gonna end well.
We don't know the whole picture, so it's a bit difficult to give a precise recommendation on the license issue. There are at least two different solutions here: you could just attach your on-prem DS to your LM in Azure (but you would need to make sure there is proper connectivity, of course), or you could obtain a "zero ingest" license from your local Splunk sales team for the DS alone. And you need to point your deployment clients'... deploymentclient.conf to the new DS (if your infrastructure was well designed, you would probably just deploy a new version of the proper app from the old DS).
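For reference, the attribute to repoint on each client looks like this (the hostname and port are placeholders):

deploymentclient.conf:
[target-broker:deploymentServer]
targetUri = new-ds.example.com:8089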
OK. You're _not_ waiting on the indexing queue, so it doesn't seem to be an issue of backpressure from the disk not being able to keep up with the rate of incoming events, or from any other configured outputs.

I'd start by checking the OS-level metrics (CPU usage, RAM). If _nothing_ changed "outside" (the amount of events, their ingestion rate throughout the day - not only the general summarized license consumption over the whole day - and the composition of the ingest stream, split among different sources, sourcetypes and so on), something must have changed within your infrastructure. There are no miracles.

Is this a bare-metal installation or a VM? There could be issues with oversubscribed resources if it's a VM, or even with the environment temperature in your DC causing your CPUs to get throttled (yes, I've seen such things). But if the behaviour changed, something must have changed. The question is what.
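A quick way to eyeball the queues over time (the host filter is a placeholder; metrics.log is the standard internal source for this):

index=_internal host=your_indexer source=*metrics.log* group=queue
| timechart span=5m avg(current_size_kb) BY name

If a queue upstream of the indexing queue (parsing, aggregation, typing) is persistently full while indexqueue stays low, the bottleneck is CPU-bound parsing rather than disk.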
Hello team, we need to migrate our deployment server from Azure cloud to on-premise, with a new IP and hostname. Please suggest in which .conf file we have to make the changes for the new IP and hostname. We also need to check the license, as the deployment server holds the master license - where is this master license file located? Kindly suggest.