All Posts

These are separate events, not multivalue fields.
The wildcard needs to be defined in the lookup, e.g. abc* will match abc and abc_123.
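As an illustration (not from the original reply), here is a minimal sketch of a wildcard-enabled lookup. The lookup name host_lookup, the file host_lookup.csv, and the owner column are all illustrative.

In transforms.conf (or Settings > Lookups > Lookup Definitions > Advanced Options):

[host_lookup]
filename = host_lookup.csv
match_type = WILDCARD(host_name)

host_lookup.csv:

host_name,owner
abc*,team_a

Search:

| makeresults
| eval host_name="abc_123"
| lookup host_lookup host_name OUTPUT owner

With WILDCARD match_type set on host_name, the abc* row matches both host_name=abc and host_name=abc_123.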
Again, are these separate events or multi-value fields in the same event?
Not without combining them into a single event - this is usually done with some sort of stats command, e.g. stats, eventstats, streamstats, etc. Depending on what you are trying to do and how the data is represented in your events, there could be a number of ways to do this.
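As a sketch of that stats approach, using the field_a / field_b names from the other question in this thread (the base search is a placeholder, and mvmap requires Splunk 8.0 or later):

index=your_index sourcetype=your_sourcetype
| stats values(field_a) as field_a values(field_b) as field_b
| eval field_c=mvmap(field_a, if(isnull(mvfind(field_b, "^".field_a."$")), field_a, null()))

stats values() pulls the separate events into one row as multivalue fields; mvfind returns NULL when a field_a value does not appear anywhere in field_b, so mvmap keeps only those values, giving field_c=raj for the sample data. The anchored regex assumes the values contain no regex special characters.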
Hi @RSS_STT, in the same option of the same section, try WILDCARD instead of CIDR. Ciao. Giuseppe
What if I want to match host_name=abc and host_name=abc_123, which are in the lookup file?
Hi @RSS_STT , in [Settings > Lookups > Lookup Definitions ] open "Advanced Options" and configure CIDR as match_type, as described at https://docs.splunk.com/Documentation/Splunk/9.4.0/Knowledge/Addfieldmatchingrulestoyourlookupconfiguration Ciao. Giuseppe
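For reference, a sketch of the equivalent transforms.conf for that lookup definition (the stanza name ip_lookup and the host values are illustrative; ip.csv and the ip/host columns come from the question):

[ip_lookup]
filename = ip.csv
match_type = CIDR(ip)

ip.csv:

ip,host
192.168.101.10,host_a
192.168.101.0/24,subnet_a

| makeresults
| eval ip="192.168.101.10"
| lookup ip_lookup ip OUTPUT host

With CIDR match_type on the ip column, 192.168.101.10 matches both the exact row and the /24 row, so host comes back as a multivalue field with both entries (subject to the lookup's max_matches setting). Note the subnet is written as 192.168.101.0/24 rather than 192.168.101.10/24.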
Can I do wildcard matching in a lookup?

| makeresults
| eval ip="192.168.101.10"
| lookup ip.csv ip OUTPUT host

In my lookup I have two entries, ip=192.168.101.10 and ip=192.168.101.10/24. How can I add a wildcard (*) for the match so that I get both entries back?
That's it! I hadn't even considered that. Thank you so much! But given that where only works on one event at a time, does that mean it can't be used to compare fields in two different sourcetypes?
No, they are not quoted strings separated by commas. I don't think I put the example the right way; let me try the example below:

field_a    field_b
rohan      rohan
rahul      rahul
raj

Now I need the difference of the above two fields:

field_c
raj
The where command (as with most commands) works on one event at a time - since your events are coming from different sourcetypes, they will be different events so the where command doesn't find any matches.
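A sketch of the usual way around that for the src_mac comparison in this thread (the index name is a placeholder; the sourcetype names are taken from the later reply):

index=your_index sourcetype IN ("bro_known_devices", "ise:syslog")
| eval src_mac=upper(src_mac)
| stats dc(sourcetype) as sourcetype_count values(sourcetype) as sourcetypes by src_mac
| where sourcetype_count > 1

After stats, each result row represents one MAC address with data gathered from both sourcetypes, so where (or a simple filter on sourcetype_count) can compare them within a single event.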
Strange, that's exactly what I tried before posting, but it still resulted in 0 hits, whereas a wildcard got me the results I was looking for. For the sake of experimentation, I changed the eval to:

| eval src_mac_{index}=src_mac

Making this change, there would be no illegal characters in the field name, only a-z plus the underscore. Despite that, the search still didn't function properly. Furthermore, single quotes cause the search not to match anything, regardless of whether I use a wildcard or not. It has to be double quotes.
Tokens are usually passed as query parameters on URLs. Without knowing how your javascript is creating or using the tokens, it is a bit difficult to be more definitive.
Try not introducing an "illegal" character in the first place!

| eval src_mac_{sourcetype}=src_mac

Also, use single quotes around the field name, particularly on the right-hand side of the eval:

| WHERE upper('src_mac-bro_known_devices') = upper('src_mac-ise:syslog')
Please clarify the contents of these fields. Are they quoted strings separated by commas? Are they multi-value fields with each value being an unquoted string? Are they different events with different values for the fields? The best way to illustrate your data is to paste the raw event data into a code block (using the </> button) so that formatting from the event is preserved.
@yuanliu @marnall @gcusello Kindly refer to the query output below:

index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log host="appu1.com"
| head 1
| rex field=_raw "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>[^\|\s]+)"
| multikv forceheader=1
| table Severity Hostname CertIssuer FilePath Status ExpiryDate

Actual output:

Severity  Hostname   CertIssuer  FilePath                                     Status   ExpiryDate
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_core.jks   Valid    2026-10-18
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_core.jks   Valid    2026-10-18
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_core.jks   Valid    2026-10-18

Expected output:

Severity  Hostname   CertIssuer  FilePath                                     Status   ExpiryDate
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_hcm.jks    Valid    2026-10-18
ALERT     appu1.com  rootca12    /applications/hs_cert/cert/live/h_hcm.jks    Expired  2020-10-18
INFO      appu1.com  key         /applications/hs_cert/cert/live/h_hcm.jks    Valid    2025-06-14
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_core.jks   Valid    2026-10-18
ALERT     appu1.com  rootca12    /applications/hs_cert/cert/live/h_core.jks   Expired  2020-10-18
INFO      appu1.com  key         /applications/hs_cert/cert/live/h_core.jks   Valid    2025-03-22

Referring to the Actual output section above, it is only showing the first line of the block, repeated three times. This is not the expected output for the event block shared. Can you please suggest if there is any mistake in the rex used?

Refer to the scenario below as well:

index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log host="appu1.com"
| rex field=_raw "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>[^\|\s]+)"
| table Severity Hostname CertIssuer FilePath Status ExpiryDate

I am seeing only the first row of output from each event block (instead of all rows from the log file):

Severity  Hostname   CertIssuer  FilePath                                     Status   ExpiryDate
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_hcm.jks    Valid    2026-10-18
INFO      appu1.com  rootca13    /applications/hs_cert/cert/live/h_core.jks   Valid    2026-10-18

Kindly suggest.
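Not part of the original post, but a common way to pull every row out of a multi-line event is to let rex return all matches and then expand them. A minimal sketch, assuming each row of the block sits on its own line in _raw (index, source, and field names are taken from the post; the intermediate field cert_row is hypothetical):

index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log host="appu1.com"
| rex field=_raw max_match=0 "(?<cert_row>[^\r\n\|]+\|[^\r\n\|]+\|[^\r\n\|]+\|[^\r\n\|]+\|[^\r\n\|]+\|[^\r\n\s\|]+)"
| mvexpand cert_row
| rex field=cert_row "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>.+)"
| table Severity Hostname CertIssuer FilePath Status ExpiryDate

By default rex stops after the first match, which fits the symptom described above (only the first row of each block coming back); max_match=0 keeps all matches as a multivalue field, and mvexpand turns each row into its own result.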
Hello, I'm setting up StatsD to send custom metrics from an AWS EC2 instance, where the Splunk OpenTelemetry Collector is running, to Splunk Observability Cloud. I've configured StatsD as a receiver using the guidelines from https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/receiver/statsdreceiver. Here's my StatsD configuration in the agent_config.yaml file:

receivers:
  statsd:
    endpoint: "localhost:8125"
    aggregation_interval: 60s
    enable_metric_type: false
    is_monotonic_counter: false
    timer_histogram_mapping:
      - statsd_type: "histogram"
        observer_type: "histogram"
        histogram:
          max_size: 50
      - statsd_type: "distribution"
        observer_type: "histogram"
        histogram:
          max_size: 50
      - statsd_type: "timing"
        observer_type: "summary"

The GitHub documentation provides exporter configurations, but I'm unsure how to implement them effectively. The GitHub document mentions the following:

exporters:
  file:
    path: ./test.json

service:
  pipelines:
    metrics:
      receivers: [statsd]
      exporters: [file]

Below is the receivers configuration I am setting in the service section of agent_config.yaml:

service:
  pipelines:
    metrics:
      receivers: [hostmetrics, otlp, signalfx, statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]

When I add statsd to the receivers list ("receivers: [hostmetrics, otlp, signalfx, statsd]" with "exporters: [signalfx]") and restart with "systemctl restart splunk-otel-collector.service", the Splunk OTel Collector agent stops sending any metrics to Splunk Observability Cloud. When I remove statsd (receivers: [hostmetrics, otlp, signalfx]), the agent starts sending metrics again. What is the correct/supported receiver/exporter to configure in the service section for statsd? Thanks
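One pattern worth sketching here, as an assumption rather than a confirmed fix: give the statsd receiver its own named metrics pipeline so the existing hostmetrics/otlp/signalfx pipeline is left untouched. The pipeline name metrics/statsd is illustrative; the receivers, processors, and signalfx exporter names are taken from the post.

service:
  pipelines:
    metrics:
      receivers: [hostmetrics, otlp, signalfx]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]
    # separate pipeline just for the custom StatsD metrics
    metrics/statsd:
      receivers: [statsd]
      processors: [memory_limiter, batch, resourcedetection]
      exporters: [signalfx]

If the collector still stops exporting after the restart, the service logs (e.g. journalctl -u splunk-otel-collector.service) should show whether the statsd receiver failed to start, for example because port 8125 is already in use or the YAML indentation is off.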
Hi all, I have a scenario where I have to find the difference of two field values (strings). For example:

fielda="raj", "rahul", "rohan"
fieldb="rahul", "rohan"

I need a third field as the difference of the above two fields:

fieldc="raj"

I am running out of ideas as to how to do it. Can someone please help with this?
"If you refer to the screenshot, it is showing events in one complete block. I simply want to show the latest available log output through the Splunk dashboard. The queries shared above only refer to the head 1 line of each event block. That seems to be an incorrect output."

This is even more confusing. The latest available log output is exactly head 1, which @marnall already gives. Could you illustrate the difference between that search output and the output you wanted? (No screenshot. Use text, text table, etc.) To help yourself, here are four golden rules that I call the four commandments of asking an answerable question:

1. Illustrate data input (in raw text, anonymized as needed), whether it is raw events or output from a search (SPL that volunteers here do not have to look at).
2. Illustrate the desired output from the illustrated data.
3. Explain the logic between the illustrated data and the desired output without SPL.
4. If you also illustrate attempted SPL, illustrate the actual output and compare it with the desired output; explain why they look different to you if that is not painfully obvious.
Already tried 3 browsers (Chrome, Brave, and Edge), but still the same result.