All Posts


We are trying to create a dashboard to understand the usage of our application versions, something like the table below:

Application Name | Version
sgs | 1.0.18

When we search a particular index with "sgs1.0.18*" source="/data/wso2/api_manager/current/repository/logs/wso2carbon.log", we get the result below:

uri="get api/mydetails/1.0.0/apime/employee-details?correlation-sit=sgs1.0.18u%26h%3d106", SERVICE_PREFIX="get api/mydetails/1.0.0/apime/employee-details?correlation-sit=sgs1.0.18u%26h%3d106", path="get api/mydetails/1.0.0/apime/employee-details?correlation-sit=sgs1.0.18u%26h%3d106", resourceMethod="get", HTTP_METHOD="get", resourceUri="api/mydetails/1.0.0/apime/employee-details?correlation-sit=sgs1.0.18u%26h%3d106"

Could you please help us with a sample Splunk query to achieve this result?

Thanks
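A starting point might be a rex-based extraction over the sample event above. This is a minimal sketch, assuming the application name and version are always concatenated in the correlation-sit parameter and that the uri field is already extracted; the index name and output field names are illustrative:

```
index="sgs" source="/data/wso2/api_manager/current/repository/logs/wso2carbon.log"
| rex field=uri "correlation-sit=(?<ApplicationName>[a-z]+)(?<Version>\d+\.\d+\.\d+)"
| stats count by ApplicationName, Version
```

If uri is not an extracted field in your events, the same rex pattern can be run against _raw instead.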
Hello Splunk members, I need some help with the questions below:

- How many calls (read/write) can we make to Splunk in a given time period (per second)? What is the default setting, is it configurable, and what are the max/min values and how are they calculated?
- How much data can we send in a given time period (MB/GB)? Is it changeable, and what are the min/max values?
- How fast can we make the next insertion? Is there a delay, or is it simultaneous? Could this cause any data loss if there is a connectivity failure or downtime?
- Is there any difference between using Splunk Enterprise in general and using the HEC method?

Thanks in advance.
Good morning, let me explain my situation. I have a Splunk tenant that belongs to a big company with branches in three zones. Each branch should only see the data of its own zone. The indexes are named in the form zone_technology, for example eu_meraki. Given this, I have created a series of alerts which are shared across all the zones and search all the indexes. How could I make the notification email, when an alert is triggered, reach only the contacts of the relevant zone?

Thank you
Okay, thanks for the suggestions. I will reach out to our Exchange team to see if they can provide a solution, and will post the outcome here. Thanks.
@inventsekar @dtburrows3 Thank you both for your reply. I was trying to use the spath command but was failing at the extraction. @dtburrows3: the second method, using the foreach loop, worked well. I am running this query against a large result set. Do the foreach loop and the JSON functions have any limitations in that case, such as results getting truncated?
Thanks for your help.

In my project there is no cluster master, but we do have a deployment server. Can I use the deployment server as the cluster master? The search head and indexer are single instances.

Regards,
Vij
you rock amt, this works a treat!
Please add this to your inputs.conf and restart the Splunk service on the UF:

crcSalt = <SOURCE>

Then update the test log and check whether the Splunk indexer still shows redundant logs.

Regarding "read from beginning": I was a bit confused with another topic from this morning about monitoring archive files. More details here: https://docs.splunk.com/Documentation/Splunk/9.1.2/Data/Monitorfilesanddirectories
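For placement, the setting goes inside the monitor stanza for the file. This is a minimal sketch reusing the stanza posted earlier in the thread (the path, index, and sourcetype are the poster's own values, not recommendations):

```ini
[monitor://C:\Users\admin\Desktop\practicelogs.txt]
disabled = 0
index = practicelogs
sourcetype = practicelogs
# The literal string <SOURCE> is the correct value here; it salts the
# file-tracking CRC with the full source path of the file.
crcSalt = <SOURCE>
```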
Hi @syaseensplunk, if this is a sample of the logs to filter, the regex in the transforms.conf doesn't match any event. You have to use a different regex that matches the events, e.g. something like this:

REGEX = \"app\":\"splunk-kubernetes-objects\"

or a different one, which you can test at https://regex101.com/r/GnkJqh/1

Ciao.
Giuseppe
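For context, a null-queue filter built around that regex would look something like the sketch below; the transform stanza name and the sourcetype are illustrative assumptions, not values from the thread:

```ini
# transforms.conf -- stanza name is illustrative
[discard_k8s_objects]
REGEX = \"app\":\"splunk-kubernetes-objects\"
DEST_KEY = queue
FORMAT = nullQueue

# props.conf -- apply the transform to the relevant sourcetype (illustrative)
[kube:objects]
TRANSFORMS-filter_k8s = discard_k8s_objects
```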
Sure, here is the configuration of my inputs.conf file:

[tcpout://<ip-address>:<port>]

[monitor://C:\Users\admin\Desktop\practicelogs.txt]
disabled = 0
index = practicelogs
sourcetype = practicelogs

I didn't understand what you meant by "read from beginning". Can you please elaborate on that? Thanks.
Hi @tahaahmed354

It looks like you may have mistakenly configured the input to read from the beginning every time. To troubleshoot this, could you please copy and paste the inputs.conf from your Windows UF (only the relevant portion is enough; remove any sensitive values)? Thanks.
Hi @Poojitha

The Splunk command spath enables you to extract information from the structured data formats XML and JSON. The command reference is here: https://docs.splunk.com/Documentation/Splunk/9.1.2/SearchReference/Spath

Please let us know if you are able to use the spath command (as seen in the previous reply). Alternatively, you could use the rex command directly to extract the field values and then do the stats. But spath is the simplest option, I think. Please let us know whether spath works for you, thanks.
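As a minimal sketch of spath usage (the field name "response" and the path "errors{}" are illustrative, not taken from this thread):

```
<base_search>
| spath input=response path=errors{} output=error
| stats count by error
```

This pulls every element of the errors array into a multivalue field named error and then counts occurrences of each distinct message.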
I am using a single universal forwarder on my Windows machine to send a log file to my Splunk host machine deployed on Ubuntu.

The problem: there were 3 log events initially in the file, and Splunk read those events and displayed them on the dashboard. But when I appended 10 more events to the same file manually, the dashboard showed 16 log events when there are only 13 events in the log file; it is reading the first three logs twice. How can I resolve this issue?
You can try something like this.

<base_search>
| eval error=coalesce(spath(response, "errors{}"), spath(response, "errors"))
| fields - response
``` extract variables from the error messages ```
| rex field=error "(?i)sub\s+\'(?<sub>[^\']+)\'"
| rex field=error "(?i)product\s+id\s+(?<product_id>[^\s]+)"
| rex field=error "(?i)location\s+id\s+(?<location_id>[^\s]+)"
| rex field=error "(?i)datetime\s+(?<start_datetime>\w+\s+\d{4}(?:\-\d{2}){2}T\d{2}(?:\:\d{2}){2}(?:\+|\-)\d{2}\:\d{2})"
``` replace variables in the error messages to get a standardized set of error messages to count against ```
| eval error=replace(replace(replace(replace(error, "(?i)sub\s+\'([^\']+)\'", "sub '***'"), "(?i)product\s+id\s+([^\s]+)", "product id ***"), "(?i)location\s+id\s+([^\s]+)", "location id ***"), "(?i)datetime\s+(\w+\s+\d{4}(?:\-\d{2}){2}T\d{2}(?:\:\d{2}){2}(?:\+|\-)\d{2}\:\d{2})", "datetime ***")
``` stats aggregation to get counts of error messages ```
| stats count as count, values(sub) as sub, values(product_id) as product_id, values(location_id) as location_id, values(start_datetime) as start_datetime by error

The results show the counts next to the standardized error messages. I also carried over all the variables that were replaced in the error messages, for context.

You could also check out the cluster command, as it gives you similar results without having to do all the extractions and replacements in inline SPL:

<base_search>
| table _time, response
| eval error=coalesce(spath(response, "errors{}"), spath(response, "errors"))
| fields - response
| cluster field=error t=0.4 showcount=true countfield=count

The error messages aren't redacted in this version, but their counts line up well with the previous example, so the clustering appears to work decently. You can read more about the cluster command here: https://docs.splunk.com/Documentation/Splunk/9.1.2/SearchReference/Cluster
Hi @yuvaraj_m91

The Splunk command spath enables you to extract information from the structured data formats XML and JSON. The command reference is here: https://docs.splunk.com/Documentation/Splunk/9.1.2/SearchReference/Spath

Please let us know if you are able to use the spath command. Alternatively, you could use the rex command directly to extract the field values and then do the stats, or a where ... like() condition should also work. But spath is the simplest option, I think. Please let us know whether spath works for you, thanks.
I have all of the below messages in the "response" field:

{"errors": ["Message: Payment failed. Reason: Hi, we attempted to process the transaction but it seems there was an error. Please check your information and try again. If the problem persists please contact your bank."]}
{"errors": ["Unable to retrieve User Profile with sub '2415d' as it does not exist"]}
{"errors": ["Unable to retrieve User Profile with sub 'dfadf' as it does not exist"]}
{"errors": ["Unable to retrieve User Profile with sub 'fdsgad' as it does not exist"]}
{"errors": ["Unallocated LRW seat not found with product id fdafdsaddsfa and start datetime utc 2024-01-06T05:30:00+00:00 and test location id dfafdfa"]}
{"errors": ["Unallocated LRW seat not found with product id sfgdfa and start datetime utc 2024-01-06T05:30:00+00:00 and test location id dsfadfsa"]}

I want to display the results with a count, like this:

Message: Payment failed. Reason: Hi, we attempted to process the transaction but it seems there was an error. Please check your information and try again. If the problem persists please contact your bank.
Unable to retrieve User Profile with sub '***' as it does not exist
Unallocated LRW seat not found with product id *** and start datetime utc 2024-01-06T05:30:00+00:00 and test location id ***
I don't know the complete path to the nested tags array, but you can do something like this to target the value contained within the Contact key in the multivalue JSON fields:

<base_search>
| eval tags_json=spath(_raw, "tags{}"),
    contact=case(
        mvcount(tags_json)==1, if(spath(tags_json, "Key")=="Contact", spath(tags_json, "Value"), null()),
        mvcount(tags_json)>1, mvmap(tags_json, if(spath(tags_json, "Key")=="Contact", spath(tags_json, "Value"), null()))
    )
| fields + _time, _raw, tags_json, contact

First we extract all JSON objects from the tags array as a multivalue field named "tags_json". From there you can use the mvmap() function to loop through the multivalue field and check each entry to see whether the "Key" value of the JSON object equals "Contact". If it does, then we know this is the JSON object we want to extract the "Value" key from, so we run spath specifically on that object and store the returned value in a field named "contact".

Option 2: Another route to take (depending on the structure of your event and whether it makes sense to do it this way) is to loop through each JSON object in the tags array and stuff the key/value pairs into a temporary JSON object that we can then run a full spath against. This is a more exhaustive approach, as opposed to the targeted one in the previous example. The SPL would look something like this:
<base_search>
| eval
    ``` extract the array of JSON objects as a multivalue field ```
    tags_json=spath(_raw, "tags{}"),
    ``` initialize the temporary JSON object that will hold all the key/value pairs contained within the tags array ```
    final_tag_json=json_object()
``` use the mode=multivalue foreach loop to iterate over each entry in the multivalue field ```
| foreach mode=multivalue tags_json
    [ | eval
        ``` json_set() sets each Key/Value as a new key/value pair in the temporary JSON "final_tag_json" ```
        final_tag_json=json_set(final_tag_json, spath('<<ITEM>>', "Key"), spath('<<ITEM>>', "Value")) ]
| fields - tags_json
``` full spath against the final_tag_json field ```
| spath input=final_tag_json
| fields - final_tag_json
| fields + _time, _raw, Contact, Name

With this method, not only is the "Contact" field extracted but the "Name" value is extracted as well: the loop walks the whole tags array and produces a new key/value pair for each entry, and the final spath then extracts them all from the temporary final_tag_json object.
Hi All,

I have a multivalue field that contains nested key/value pairs, with the key named "Key" and the value named "Value". Example snippet:

tags: [
  {
    Key: Contact
    Value: abc@gmail.com
  }
  {
    Key: Name
    Value: abc
  }
]

I want to extract only the Contact value from here, i.e. abc@gmail.com. I am trying with multivalue functions and spath but am still stuck. Please help me.

Regards,
PNV
Hi, just thought to update you (and all others): as per the doc https://docs.splunk.com/Documentation/Splunk/9.1.2/Troubleshooting/Usebtooltotroubleshootconfigurations the right command is:

splunk btool inputs list

Thanks.
>>> i would like to know how to install btool on windows

When we install Splunk, btool is installed automatically as part of the installation. From your question, I understand that you are really looking for "how to run btool on Windows".

>>> i was trying to open in windows as an administrator and I could get the results.

Just to make sure you are running the command prompt with admin rights: please check that the top left of the window shows "Administrator: Command Prompt".

>>> C:\Program Files\Splunk\bin>splunk btool inputs list
>>> 'splunk' is not recognized as an internal or external command, operable program or batch file.

Please let us know whether you installed Splunk on the default path or on a custom path. Thanks.
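Assuming the default install path, running the binary explicitly from the bin directory should avoid the "not recognized" error. This is just an illustration of the commands in a Windows command prompt; the optional --debug flag also shows which .conf file each setting comes from:

```
cd /d "C:\Program Files\Splunk\bin"
splunk.exe btool inputs list --debug
```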