All Posts


Is there any prebuilt search (such as the rest command) to find the number of triggered alerts for a particular dashboard? If not, can we create a search that identifies which triggered alert is associated with which dashboard for a specific time period?
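As a rough starting point (and note that alerts are attached to saved searches, not to dashboards, so a per-dashboard count is not directly available), one could list currently fired alerts via the REST API. A sketch only, assuming you have permission to read the fired_alerts endpoint:

```spl
| rest /servicesNS/-/-/alerts/fired_alerts
| table title triggered_alert_count
```

Mapping those saved searches back to the dashboards that reference them would still be a manual (or scripted) correlation step.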
Hi SMEs,   I would like to create an alert in Splunk ES that triggers if any of the heavy forwarders is rebooted or shut down by someone. Thanks in advance.
Thank you @ITWhisperer and @gcusello. It is working now. If anything more is required, I will get back. Thanks again.
The third option, editing in Simple XML, still works as of today! However, the first option no longer does; I get a JavaScript error. Not to mention that the forced XML v=1.0 issue will deprecate this option soon.
Those settings belong in props.conf on the indexers and heavy forwarders. BTW, the TIME_PREFIX setting should describe what comes *before* the timestamp, not the timestamp itself. The inputs.conf file should look a little like this:

[monitor:///path/to/file]
index = foo
sourcetype = mysourcetype
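For completeness, a minimal props.conf sketch for the indexer/heavy-forwarder side. The stanza name matches the example sourcetype above, but the timestamp prefix and format are assumptions you would adjust to your actual data:

```ini
# props.conf - deployed to indexers and heavy forwarders
[mysourcetype]
# TIME_PREFIX is the text that appears immediately BEFORE the timestamp,
# not the timestamp itself (example: events like "timestamp=2023-10-30 22:10:40")
TIME_PREFIX = timestamp=
TIME_FORMAT = %Y-%m-%d %H:%M:%S
```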
Hi @bennett_riegel
1. Did you download the app as a tar file from Splunkbase? (The file name looks like "splunk-security-essentials_371.tgz".)
2. In your Splunk UI, go to the left-side Apps dropdown: Apps --- > Manage Apps --- > Install app from file.
3. Then select the tar file ("splunk-security-essentials_371.tgz") and upload it; it will install smoothly.
4. A Splunk restart will then be required.
Hi @AL3Z .. Could you please edit the sample log (remove all sensitive information such as IP addresses and usernames)? Thanks. As for the props and transforms, that requires some homework from your side. I will try my best to create them and suggest something back, thanks.
Hi @R15 .. This search actually runs fine. May we know the remaining portion of the search (after calculating the avg, how do you handle the avg values)? If you could provide a screenshot, that would be of great help, thanks.
The question is understandable. The answer, however, is that you can't do that reliably with Splunk's built-in functionality. Splunk processes one event at a time and doesn't keep any state that could be carried from one event to another. You can sometimes do some magic with cloning events and cutting different parts from each copy, but that hack is ugly, non-scalable, and inefficient.
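For the record, the cloning hack usually leans on CLONE_SOURCETYPE at index time. A sketch only, with purely illustrative stanza and sourcetype names, and not a recommendation:

```ini
# transforms.conf (illustrative names)
[clone_events]
REGEX = .
CLONE_SOURCETYPE = mydata_clone

# props.conf (illustrative names)
[mydata]
TRANSFORMS-clone = clone_events
```

Each incoming event is duplicated into the mydata_clone sourcetype, and further SEDCMD/transform rules can then cut different pieces from each copy, which is exactly the ugly part.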
I was building a new search and started getting this error with various functions. I simplified my search down to something straight out of the documentation to make sure I wasn't missing something silly, but I still get the error even with this:

index=* | eval c=avg(1, 2, 3)

What's going on?
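One thing worth checking: eval's avg() function only exists in relatively recent Splunk versions, so on an older instance that exact documentation example will fail. If that's the situation, the same result can be computed arithmetically. A sketch:

```spl
index=_internal
| head 1
| eval c=(1+2+3)/3
```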
Getting a ton of these telemetry errors in the Event Log of a Windows server with a UF installed. They started a few days ago. What could be causing them? No changes have been made to the UF or Splunk infrastructure recently.

1.6987038408387303e+09 error exporterhelper/queued_retry.go:183 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "name": "signalfx", "error": "Permanent error: \"HTTP/2.0 401 Unauthorized\\r\\nContent-Length: 0\\r\\nDate: Mon, 30 Oct 2023 22:10:40 GMT\\r\\nServer: istio-envoy\\r\\nWww-Authenticate: Basic realm=\\\"Splunk\\\"\\r\\nX-Envoy-Upstream-Service-Time: 5\\r\\n\\r\\n\"", "dropped_items": 50}
go.opentelemetry.io/collector/exporter/exporterhelper.(*retrySender).send
    /builds/o11y-gdi/splunk-otel-collector-releaser/.go/pkg/mod/go.opentelemetry.io/collector@v0.53.0/exporter/exporterhelper/queued_retry.go:183
go.opentelemetry.io/collector/exporter/exporterhelper.(*metricsSenderWithObservability).send
    /builds/o11y-gdi/splunk-otel-collector-releaser/.go/pkg/mod/go.opentelemetry.io/collector@v0.53.0/exporter/exporterhelper/metrics.go:132
go.opentelemetry.io/collector/exporter/exporterhelper.(*queuedRetrySender).start.func1
    /builds/o11y-gdi/splunk-otel-collector-releaser/.go/pkg/mod/go.opentelemetry.io/collector@v0.53.0/exporter/exporterhelper/queued_retry_inmemory.go:119
go.opentelemetry.io/collector/exporter/exporterhelper/internal.consumerFunc.consume
    /builds/o11y-gdi/splunk-otel-collector-releaser/.go/pkg/mod/go.opentelemetry.io/collector@v0.53.0/exporter/exporterhelper/internal/bounded_memory_queue.go:82
go.opentelemetry.io/collector/exporter/exporterhelper/internal.(*boundedMemoryQueue).StartConsumers.func2
    /builds/o11y-gdi/splunk-otel-collector-releaser/.go/pkg/mod/go.opentelemetry.io/collector@v0.53.0/exporter/exporterhelper/internal/bounded_memory_queue.go:69
I did look into the bin command a bit further and it did help, thanks again! I needed to timechart my data using the latest value of each day, as the value kept growing and the data points were just snapshots of the storage for that day. Final code:

| bin _time span=12h
| stats latest(<storage size>) as <storage size> by _time data_storage
| timechart span=12h sum(<storage size>) by data_storage

For my requirement, I just needed bin span=1d and timechart span=1d to get a daily trend for the past year of data.
Hi, we need to forward XML documents from a UF to indexers. The documents have key fields both in a one-time HEADER section and in a section that can be repeated up to 100,000 times. So, for example, the file could look like:

<PUBS>
<HEADER><Identifier>93234</Identifier>
<REPEATSECTION><Balance>8751.23</Balance></REPEATSECTION>
<REPEATSECTION><Balance>943.43</Balance></REPEATSECTION>
... note: repeats up to 100,000 times, with many, many more fields than shown here. Total file size >= 300 MB ...
<REPEATSECTION><Balance>123.233</Balance></REPEATSECTION>
</PUBS>

If the UF breaks events before <REPEATSECTION>, then we could have one Splunk event per REPEATSECTION, but the fields in the HEADER would not be available. If the UF sends the whole 300 MB file to an indexer, is there a configuration of props/transforms on the indexer that can create one Splunk event per REPEATSECTION but also get the fields from the HEADER section? I'm trying to ask a good question here as best I can. Does my question make sense to anyone? Thanks!
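For the event-breaking half of this, a hedged props.conf sketch (the sourcetype name is an assumption). Note that this only splits the file into one event per REPEATSECTION; it does not carry the HEADER fields into each event, which is the part that typically requires preprocessing the file outside Splunk before ingestion:

```ini
# props.conf on the indexer (sourcetype name is illustrative)
[xml_pubs]
SHOULD_LINEMERGE = false
# break before each <REPEATSECTION>; the empty capture group discards nothing
LINE_BREAKER = ()<REPEATSECTION>
# files this large need the truncation limit lifted
TRUNCATE = 0
```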
Thanks!!! @ITWhisperer 
And this is a question about which functionality of which add-on/product? All we know is that somewhere (where?) you have some data that is different from what you'd expect. We have no clue whether this is raw data (if so, where does it come from?) or processed (how?).
Isn't your alert a real-time one by any chance? (Which isn't a very good idea anyway).
Ok. Again. One search gives you a single number. The other search returns several numbers (depending on how many titles you have in your data). What do you want to subtract from what? And again, why extract so many fields when in the end you're just doing stats count?
Ok. So you have some data which might have some format (but then again, it might not), and you want us to find something for you, but you don't tell us what it is. How are we supposed to do that if we neither understand the data nor know what you're looking for?
Hi, my table for VPN connections by a user shows the MAC address of the user's laptop in place of the external IP. CITY, COUNTRY, REGION, LAT, LON, everything related to the user's location, comes back blank. Does this indicate any unusual activity by the user?
It should be achievable, but not with the GUI, and not with "just the CLI". It requires a lot of bending over backwards to find the buckets, copy them over, rename them... Especially if the destination cluster is a production environment, I wouldn't touch it without testing in a dev environment and help from a friendly Splunk Consultant (or someone equally knowledgeable). And you can't "copy" just part of the buckets. For that, you'd need to export the raw data and re-ingest it into the destination cluster.