All Posts

Yeah, the issue I have is that the problem ID is the only common field, but by using problem ID I wouldn't return the unlinked Incident data. Thanks
Hi @DonBaldini, I'd use OR to avoid subsearches. Anyway, I suppose that the issue is related to the fact that you're using the incident field, which could be null. Please check the first eval to find a value for the incident field also for the sourcetype "problem". Ciao. Giuseppe
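A sketch of the OR-based single search this suggests, with the incident field given a value for both sourcetypes. The field names follow the multisearch query quoted elsewhere in this thread, except `dv_related_incident`, which is a hypothetical stand-in for whatever field on the problem records links back to an incident; adjust to your data:

```
index=servicenow (sourcetype="incident" OR sourcetype="problem")
| eval incident=if(sourcetype="incident", number, dv_related_incident)
| stats latest(eval(if(sourcetype="incident",dv_opened_at,null()))) as inc_opened,
        latest(eval(if(sourcetype="problem",dv_state,null()))) as prb_state
    by incident
```

Because incident is now non-null for problem rows too, the stats no longer drops them.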
I am analysing Incident to Problem linkage by searching the Incident table and then using a join to the Problem table to get supporting data for linked problems. The problem I have is that with join I am close to the threshold at which the search fails for longer time periods. I have tried multisearch and an OR search, but I need to retain Incident results where there is no problem linked. Hope this makes sense; this is the code I have written:

```
| multisearch
    [ search index=servicenow sourcetype="incident" ]
    [ search index=servicenow sourcetype="problem" ]
| eval incident=if(sourcetype="incident",number,null()),
       problem=if(sourcetype="incident",dv_problem_id,dv_number)
| stats latest(eval(if(sourcetype="incident",dv_opened_at,null()))) as inc_opened,
        latest(problem) as problem,
        latest(eval(if(sourcetype="problem",dv_state,null()))) as prb_state
    by incident
```
@ITWhisperer I attempted to execute your search, but my goal is to identify and output the assets that are present in the `myinventory` lookup but absent from the `syslog_farm` index.
I am using the Splunk OTEL Collector Helm chart to send logs from my GKE pods to the Splunk Cloud Platform. I have set `UsesplunkIncludeAnnotation` to `true` to filter logs from specific pods. This setup was working fine until I tried to filter the logs being sent. I added the following configuration to my `splunk` values.yaml:

```
config:
  processors:
    filter/ottl:
      error_mode: ignore
      logs:
        log_record:
          - 'IsMatch(body, "GET /status")'
          - 'IsMatch(body, "GET /healthcheck")'
```

When I applied this configuration, the specified logs were excluded as expected, but it no longer filtered logs from the specified pods. I am still receiving logs from all my pods, and the annotation is not taking effect. Additionally, the host is not displaying correctly and shows as "unknown". (I will attach a screenshot for reference.) My questions are:
1. How can I exclude these specific logs more effectively?
2. Is there a more efficient way to achieve this filtering?
Hello, I used the Splunk REST API Search endpoint to retrieve the latest fired alerts based on a title search. I get the fired alerts in alphabetical order but not in chronological order, since all the alerts returned have the default field <updated>1970-01-01T01:00:00+01:00</updated>. Here are the URL and query I used:

https://<host>:<mPort>/services/alerts/fired_alerts?search=name%3DSOC%20-*&sort_dir=desc&sort_key=updated

```
| rest /services/alerts/fired_alerts/
| search title="SOC - *"
| sort -updated
| table title, updated, triggered_alert_count, author
```

Here are the references I used:
Search endpoint descriptions - Splunk Documentation
Using the REST API reference - Splunk Documentation

So, how can I retrieve fired alerts in chronological order with a title search? Or how can I obtain a field indicating the date the alert was triggered? Thanks in advance.
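If the goal is ordering alerts by when they actually fired, one alternative worth testing is to search the audit index rather than the REST endpoint. This is only a sketch, assuming your deployment keeps default audit logging and that fired-alert audit events carry `ss_name` and `trigger_time` fields; verify the field names in your environment before relying on them:

```
index=_audit action=alert_fired ss_name="SOC - *"
| eval fired_at=strftime(trigger_time, "%Y-%m-%d %H:%M:%S")
| sort - trigger_time
| table ss_name, fired_at, severity
```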
That event does indeed contain an instanceId field, but the sourcetype may not match the "veeam_vbr_syslog" value expected by the DM. It's hard to tell from the obscured screenshot. Everything in the "constraints" section of the DM must match your data for it to be found by the DM and appear in the dashboard.
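A quick sanity check for that: list which index/sourcetype combinations the events actually carry, so they can be compared against the DM constraints. The `index=*` scope here is just a placeholder; narrow it to wherever the Veeam data lands:

```
index=* instanceId=*
| stats count by index, sourcetype
```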
Assuming the values of the IP address, hostname and fqdn_hostname are unique in your lookup, you could try something like this:

```
index=prod_syslogfarm
| append [| inputlookup myinventory.csv]
| eval host=coalesce(lower(hostname),lower(Hostname))
| eventstats count as host_count by host
| eval ip=coalesce(ip_address,IP_Address)
| eventstats count as ip_count by ip
| eval fqdnhost=coalesce(lower(fqdn_hostname),lower(FQDN_Hostname))
| eventstats count as fqdn_count by fqdnhost
| where host_count=1 OR ip_count=1 OR fqdn_count=1
```

You may need to adjust this depending on the actual field names in your index and lookup file.
You can find it here. For a list of all config files: https://docs.splunk.com/Documentation/Splunk/9.2.2/Admin/Listofconfigurationfiles
This app is built and supported by Splunk, so I would suggest creating a support case with Splunk for this. (As this app is new to me, I don't have any suggestions; let's wait for other community members' suggestions.)
Hi @aruncp333... this task should not be app-specific. Simply search for the particular data and count it, then save it as an alert with a threshold of count > 0. Please let us know if you got the idea or have any questions. Thanks.
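A minimal sketch of that kind of alert search; the index, sourcetype and matched string below are placeholders for your actual data:

```
index=main sourcetype=your_sourcetype "text to match"
| stats count
| where count > 0
```

Because the `where` clause drops the row when nothing matched, you can save this as an alert with the trigger condition "number of results is greater than 0".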
Either count_err doesn't exist in xxx.csv, or no events have a value in id which matches an entry in xxx.csv with a corresponding value in count_err.
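One quick way to verify which columns the lookup file actually contains (assuming xxx.csv is accessible as a lookup):

```
| inputlookup xxx.csv
| fieldsummary
| table field, count
```

If count_err does not appear in the field column, or appears under a slightly different spelling, that would explain why OUTPUTNEW adds nothing.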
Hi @yuanliu, it seems that the current query is only retrieving results from the 'myinventory' lookup without performing the intended comparison with the 'asset_inventory' data. It appears that you need to modify the query so that it compares both datasets ('myinventory' and 'asset_inventory') and returns only the discrepancies between the two.
This is a line of code that takes the fields from the CSV file:

```
| lookup xxx.csv id OUTPUTNEW system time_range
```

I want to add one field:

```
| lookup xxx.csv id OUTPUTNEW system time_range count_err
```

When I do this, nothing is added. Why? I would appreciate your help, thanks.
Thank you @richgalloway, but this does not meet my requirement, as my syslog data contains a combination of hostname, FQDN and IP address, and I have to match all three of these fields with the respective asset inventory data, which has these fields (Hostname, IP address, FQDN). So I have to check whether the syslog hostname or IP or FQDN is present in the asset inventory data, and output only the syslog data that doesn't match any of these three fields in the asset inventory data. I have posted this question with examples in this link. Although I accepted the answer here https://community.splunk.com/t5/Splunk-Search/How-to-compare-a-look-up-field-with-multivalued-indexed-data-in/m-p/691717#M235509, after further testing it doesn't seem to be working as expected.
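One way to test each of the three fields independently is to run one lookup per field and keep only events where none of them matched. This is a sketch; the lookup file and field names follow the examples in this thread and will need adjusting to your actual data, and note that CSV lookup matching is case-sensitive by default:

```
index=prod_syslogfarm
| lookup myinventory.csv Hostname AS hostname OUTPUT Hostname AS matched_host
| lookup myinventory.csv IP_Address AS ip_address OUTPUT IP_Address AS matched_ip
| lookup myinventory.csv FQDN_Hostname AS fqdn_hostname OUTPUT FQDN_Hostname AS matched_fqdn
| where isnull(matched_host) AND isnull(matched_ip) AND isnull(matched_fqdn)
```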
So, as I said, we are using a data model with tstats, and with tstats we have to use a by clause with fields like All_Traffic.src_ip, so if the field is not converted before this by clause it cannot be used afterwards. What I did instead: I renamed the field in the data model, and using a field alias I changed the name of this field. Now we can use src_ip instead of data.clientaddr in any search without renaming it. Obviously the rename command is more hassle-free, but as we all know, a permanent solution is what everyone needs.
Dear Experts, we are on the latest version of the ABAP agent (24.5). In an S/4HANA system, we noticed a runtime error getting triggered every hour. We identified the related KPI and disabled it, but the customer needs a permanent solution, because it is related to SOST (mail monitoring):

TSV_TNEW_PAGE_ALLOC_FAILED | No more memory available to add rows to an internal table. | SAPLSX11 | LSX11F02

Any idea on a permanent solution? Thanks, Jananie
Hi @Nawab, you have two solutions:
add new fields to your Data Model - I don't like this solution;
rename your fields to fit them into the DM fields - this is the preferable solution, because this way you can use the DM fields in your searches with tstats.
These aliases should be visible both in DMs and in the original data; it depends on where you renamed them: in the DM or in the add-on. Do it in the add-on, so you can see them in interesting fields. Ciao. Giuseppe
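In the add-on, such a rename is typically a field alias in props.conf for the relevant sourcetype. A minimal sketch, where the sourcetype name is a placeholder for whatever your device's data uses:

```
[your_device_sourcetype]
FIELDALIAS-clientaddr = "data.clientaddr" AS src_ip
```

After the configuration is reloaded, src_ip should show up next to data.clientaddr in interesting fields and can be used by the Network Traffic data model's src_ip mapping.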
Try something like this - note that it doesn't deal with All - for that (should you decide it is necessary), you would have to do something a bit more complicated:

```
<input type="multiselect" token="year">
  <label>Year</label>
  <fieldForLabel>year</fieldForLabel>
  <fieldForValue>year</fieldForValue>
  <search>
    <query>| inputlookup supported_years.csv | dedup year | table year</query>
  </search>
  <default>2023</default>
  <initialValue>2023</initialValue>
  <change>
    <eval token="earliest">mvindex(mvsort($form.year$),0)</eval>
    <eval token="latest">mvindex(mvsort($form.year$),mvcount($form.year$)-1)</eval>
    <eval token="timeRangeEarliest">strptime($earliest$."0101","%Y%m%d")</eval>
    <eval token="timeRangeLatest">relative_time(strptime(($latest$)."0101","%Y%m%d"),"+1y")</eval>
  </change>
</input>
```
We ingested some data from one device which is not added to the Network Traffic data model by default. This device sends data in JSON format. The data is added to the data model, but when I use auto-extracted fields and rename a field to an already existing field, it still shows the original name in interesting fields.

source field = data.clientaddr
dest field = src_ip

The reason I need this changed at source level is that I want one search to work for all devices, and I am using the tstats command in the search. In interesting fields it is still showing data.clientaddr instead of src_ip.