All Posts

Thank you for the detailed answer, it's really helpful.
As @bowesmana exemplifies, putting your complete set of values in a lookup is one way to count "missing" values. Another way is to put them in a multivalued field and use that field for counting. Here is an example:

index=idx1 host=host1 OR host=host2 source=*filename*.txt field1!=20250106 (field2="20005") OR (field2="20006") OR (field2="20007") OR (field2="666")
| eval field2prime = mvappend("20005", "20006", "20007", "666")
| mvexpand field2prime
| eval field2match = if(field2 == field2prime, 1, 0)
| stats sum(field2match) as count by field2prime
| rename field2prime as field2

Here is an emulation you can play with and compare with real data:

| makeresults count=16
| streamstats count as _count
| eval field2 = tostring(round(_count / 7.5) + 20005)
``` the above emulates
index=idx1 host=host1 OR host=host2 source=*filename*.txt field1!=20250106 (field2="20005") OR (field2="20006") OR (field2="20007") OR (field2="666")
producing 20005 x3, 20006 x8, 20007 x5; tostring makes field2 comparable with the string values from mvappend ```

Mock data looks like this:

field2
20005
20005
20005
20006
20006
20006
20006
20006
20006
20006
20006
20007
20007
20007
20007
20007

If you count this against field2 directly, you get

field2  count
20005   3
20006   8
20007   5

Using the above search, the result is

field2  count
20005   3
20006   8
20007   5
666     0
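If it helps to see the logic outside of SPL, here is a rough Python sketch of what the mvappend/mvexpand/stats pipeline does (the sample events and field names are made up for illustration):

```python
from collections import Counter

# Hypothetical sample events mirroring the mock data above.
events = (
    [{"field2": "20005"}] * 3
    + [{"field2": "20006"}] * 8
    + [{"field2": "20007"}] * 5
)

# The complete set of expected values, like mvappend(...) in the search.
expected = ["20005", "20006", "20007", "666"]

# mvexpand pairs every event with every expected value; stats sum(field2match)
# then counts the matches, so expected values with no events still show 0.
counts = Counter({v: 0 for v in expected})
for event in events:
    for v in expected:
        if event["field2"] == v:
            counts[v] += 1

print(dict(counts))  # {'20005': 3, '20006': 8, '20007': 5, '666': 0}
```

The key point is that the zero for 666 only appears because the expected values were injected into the data before counting.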
Getting the following static errors in a Splunk SOAR PR review from the bot.

1.

{
  "minimal_data_paths": {
    "description": "Checks to make sure each action includes the minimal required data paths",
    "message": "One or more actions are missing a required data path",
    "success": false,
    "verbose": [
      "Minimal data paths: summary.total_objects_successful, action_result.status, action_result.message, summary.total_objects",
      " action one is missing one or more required data path",
      " action two is missing one or more required data path",
      " action three is missing one or more required data path"
    ]
  }
},

I have provided all the data paths in the output array in the <App Name>.json file. Is there any other place where I have to provide the data paths?

2.

{
  "repo_name_has_expected_app_id": {
    "description": "Validates that the app ID in the app repo's JSON file matches the recorded app ID for the app",
    "message": "Could not find an app id for <App Name>. Please add the app id for <App Name> to data/repo_name_to_appid.json",
    "success": false,
    "verbose": [
      "Could not find an app id for <App Name>. Please add the app id for <App Name> to data/repo_name_to_appid.json"
    ]
  }
}

How do we resolve this issue? Did I miss any file?
Thank you for your reply. Could you tell me how to set up indexers in a private subnet without using an NLB, and how to configure forwarders?
What field does your data contain that holds the sensor value? Did you change the query as needed to pick up that field?
I can manually count and see that there are x # of sensors setup per hostname.

You need to show volunteers here HOW you count the number of sensors from logs (without using SPL). Here are four commandments to help you ask answerable questions in this forum:

1. Illustrate the data input (in raw text, anonymized as needed), whether it is raw events or output from a search (SPL that volunteers here do not have to look at).
2. Illustrate the desired output from the illustrated data.
3. Explain the logic between the illustrated data and the desired output without SPL.
4. If you also illustrate attempted SPL, illustrate the actual output and compare it with the desired output; explain why they look different to you if that is not painfully obvious.
Yes, it returned 0s
Did you try the query I posted?
You can't count the non-existence of a field value unless you know what values are expected - that is generally termed "proving the negative" in these forums.

You would typically have a lookup file of the expected values for field2. For example, if you have a CSV with field2 having two values, 666 and 999, and your search returns N results for value 999 but no 666 results, then this at the end of your search will add a 0 for all missing expected values:

| inputlookup append=t field2.csv
| stats max(count) as count by field2
| fillnull count
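For readers less familiar with inputlookup, here is a rough Python sketch of what the append-then-stats-then-fillnull pattern does (the observed counts and lookup values are made up for illustration):

```python
# Counts produced by the search: 999 had events, 666 had none.
observed = {"999": 4}

# Expected field2 values, playing the role of the lookup CSV.
lookup_values = ["666", "999"]

# `inputlookup append=t` appends one row per expected value (no count),
# `stats max(count) by field2` keeps the real count where one exists,
# and `fillnull` turns the missing counts into 0.
counts = {v: observed.get(v, 0) for v in lookup_values}

print(counts)  # {'666': 0, '999': 4}
```

As with the multivalue approach, the zero rows exist only because the expected values were appended before aggregating.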
Let's say I have a dashboard set up with 5 hosts (serverA, serverB, serverC, serverD, serverE), and for each host there are 5-10 queries set up to pull data using the same index=idx_sensors. I can manually count and see that there are x # of sensors set up per hostname. How would I create a query to check how many sensors are being monitored by hostname? (I've got 7 different dashboards with multiple hosts monitoring X number of sensors. I need to get metrics for which host has how many sensors currently being monitored.)
You can use rex, but your example is not entirely clear - are you expecting -, |, and / characters in your output? See the rex statement in this example with your data.

| makeresults format=csv data="raw
00012243asdsfgh - No recommendations from System A. Message - ERROR: System A | No Matching Recommendations
001b135c-5348-4arf-b3vbv344v - Validation Exception reason - Empty/Invalid Page_Placement Value ::: Input received - Channel1; ::: Other details -
001sss-445-4f45-b3ad-gsdfg34 - Incorrect page and placement found: Channel1;
00assew-34df-34de-d34k-sf34546d :: Invalid requestTimestamp : 2025-01-21T21:36:21.224Z
01hg34hgh44hghg4 - Exception while calling System A - null"
| rex field=raw max_match=0 " (?<words>[A-Za-z]+)"
| eval words = mvjoin(words, " ")
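The rex pattern is ordinary PCRE, so the same extraction can be sketched in Python if you want to experiment outside Splunk (using the first sample line from the data above):

```python
import re

raw = ("00012243asdsfgh - No recommendations from System A. "
       "Message - ERROR: System A | No Matching Recommendations")

# Same idea as `rex max_match=0 " (?<words>[A-Za-z]+)"`: capture every run
# of letters that immediately follows a space, then join them like mvjoin.
words = re.findall(r" ([A-Za-z]+)", raw)
print(" ".join(words))
```

Note that the leading space in the pattern is what drops tokens such as the standalone dashes and the mixed alphanumeric ID at the start of the line.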
What is the JSON syntax? The documentation is not clear.
Use dc (distinct count):

index=idx_sensors sourcetype=sensorlog
| stats dc(sensor_field) as sensors by host
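To see why dc rather than count is what you want here, this rough Python sketch mirrors the same aggregation (hosts and sensor names are made up for illustration):

```python
from collections import defaultdict

# Hypothetical log rows as (host, sensor) pairs; a sensor can log many events.
rows = [
    ("serverA", "temp1"), ("serverA", "temp1"), ("serverA", "fan1"),
    ("serverB", "temp1"), ("serverB", "temp2"), ("serverB", "temp2"),
]

# `stats dc(sensor_field) as sensors by host`: collect distinct sensor
# values per host, then count the set sizes - not the raw event count.
sensors_by_host = defaultdict(set)
for host, sensor in rows:
    sensors_by_host[host].add(sensor)

result = {host: len(sensors) for host, sensors in sensors_by_host.items()}
print(result)  # {'serverA': 2, 'serverB': 2}
```

A plain `stats count by host` would report 3 events for each host above, which is the behavior described in the question.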
Can you give a sample of your events? You could add one or more fields after the by clause on stats if there is something you could use.
Calculating metrics. I need to count the number of sensors that are created and monitored for each host. I have the index and sourcetype. I created about 7 different dashboards with multiple hosts on each dashboard, and I need to get a count of the number of sensors being monitored by each host.

index=idx_sensors sourcetype=sensorlog
| stats count by host

The above query is giving me all the hostnames that are being monitored, but the count is giving me all the events... I just need the # of sensors per host.
@jkamdar Please follow this  https://docs.splunk.com/Documentation/Forwarder/9.4.0/Forwarder/Installanixuniversalforwarder 
@jkamdar Yes, please replace the user while using chown. If you still face issues, it might be necessary to check with the OS team to determine if there are any permission-related problems.
Hi everyone, I'm running a query in Splunk using the dbxquery command and received the following error:

Error in 'script': Getinfo probe failed for external search command 'dbxquery'.

When I check Apps -> Manage Apps -> Splunk DB Connect, I see the version is 2.4.0. Please help me identify the cause of this error and how to fix it. Thank you!
Thank you for your response, it has solved my problem!
Not sure if I fully understand the requirement. But in general, you can assign a non-null string to those fields. For example,

| eval MX = coalesce(MX, "MX is null")

The issue, I suspect, is that when you transpose, all those values representing null will collapse and skew the format. Is this the problem? If so, you can force these values to be different, e.g.,

| eval MX = coalesce(MX, "MX is null for " . FQDN)

Hope this helps.
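For anyone less familiar with coalesce, here is a rough Python sketch of its behavior and of the per-row trick above (the field names and sample row are made up for illustration):

```python
def coalesce(*args):
    """Return the first non-None argument, like SPL's coalesce()."""
    for a in args:
        if a is not None:
            return a
    return None

# A hypothetical row where MX is null; make the filler unique per FQDN
# so the values do not collapse when transposed.
row = {"FQDN": "host.example.com", "MX": None}
row["MX"] = coalesce(row["MX"], "MX is null for " + row["FQDN"])
print(row["MX"])  # MX is null for host.example.com
```

Because each filler string embeds the FQDN, every formerly-null cell stays distinct after transpose.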