All Posts


| stats max(avg_io_wait_time) as avg_io_wait_time by host | sort avg_io_wait_time | streamstats c as severity | eval host = printf("%*s", len(host) + severity, host) | stats max(avg_io_wait_time) as avg_io_wait_time by host
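The printf("%*s", len(host) + severity, host) step left-pads each host name with a number of spaces equal to its rank, so the padded names stay distinct strings whose lexicographic order follows the ranking. A minimal Python sketch of the same padding idea (the host names and wait times here are made up for illustration):

```python
# Left-pad each host name by its rank so lexicographic order of the
# padded names follows the ranking -- the same idea as SPL's
# printf("%*s", len(host) + severity, host).
hosts = [("hostC", 0.9), ("hostA", 0.5), ("hostB", 0.7)]  # (host, avg_io_wait_time)

ranked = sorted(hosts, key=lambda hw: hw[1])  # ascending wait time
padded = {}
for severity, (host, wait) in enumerate(ranked, start=1):
    # "%*s" right-aligns host in a field of the given width,
    # i.e. prepends (width - len(host)) spaces.
    padded["%*s" % (len(host) + severity, host)] = wait

# Sorting the padded names puts the most-padded (highest severity,
# highest wait time) host first.
for name in sorted(padded):
    print(repr(name), padded[name])
```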
This worked smoothly: | stats max(avg_io_wait_time) as avg_io_wait_time by host | sort avg_io_wait_time | streamstats c as severity | eval host = printf("%*s", len(host) + severity, host) | stats max(avg_io_wait_time) as avg_io_wait_time by host
I am unable to see the Uploaded tab on my free trial Splunk Cloud instance.
Do you face this issue for all SAML users, or only for a specific user?
My problem was solved by creating a private app with a customized props.conf file, which defines a different TZ for different hosts, as shown below:

[host::hostA]
TZ = xxx

[host::hostB]
TZ = xxx
Hi @elend , you are working on Data Models, so the only approach is to create a calculated field that, when the DM is populated, assigns a value when a field is empty, e.g.: | eval destination=if(isempty(destination),"unknown",destination) But you have to define this as a calculated field used in the population search, not in the same search. Then you have to do this for all your fields. Ciao. Giuseppe
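The calculated-field logic above is just a null-coalesce. A hypothetical Python equivalent of eval destination=if(isempty(destination),"unknown",destination), using made-up sample events:

```python
def fill_empty(value, default="unknown"):
    """Mimic SPL's if(isempty(x), "unknown", x): replace None or an
    empty string with a fixed placeholder so grouped counts keep the row."""
    return default if value is None or value == "" else value

events = [{"destination": "10.0.0.5"}, {"destination": ""}, {"destination": None}]
filled = [fill_empty(e["destination"]) for e in events]
print(filled)  # ['10.0.0.5', 'unknown', 'unknown']
```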
Thanks for the answer. Does scheduled task also start again on splunk restart? 
Hi @bhaskar5428 , it should already be extracted, because Splunk recognizes the pair field=value; anyway, you could try this regex: orderId\=(?<orderId>[^\]]+) which you can test at https://regex101.com/r/OtSfRS/1 Ciao. Giuseppe
orderId=(?<orderid>[^\]]+)\]
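Either pattern can be sanity-checked outside Splunk. A small Python sketch applying the same pattern (Python spells named groups (?P<...>) to the sample event posted in the question:

```python
import re

# Sample event from the question.
line = ("Process transaction locally [idempotencyId=27cb55d0-3844-4e8f-8c4b-"
        "867ed64610a220240821034250387S39258201QE, deliveringApplication=MTNA0002, "
        "orderId=8e1d1fc0-5fe2-4643-bc1f-12debe6a7a06]")

# Same idea as the suggested regex: [^\]]+ consumes up to the closing bracket.
m = re.search(r"orderId=(?P<orderId>[^\]]+)", line)
print(m.group("orderId"))  # 8e1d1fc0-5fe2-4643-bc1f-12debe6a7a06
```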
Process transaction locally [idempotencyId=27cb55d0-3844-4e8f-8c4b-867ed64610a220240821034250387S39258201QE, deliveringApplication=MTNA0002, orderId=8e1d1fc0-5fe2-4643-bc1f-12debe6a7a06] I would like to extract the order ID from the above sample data, which is 8e1d1fc0-5fe2-4643-bc1f-12debe6a7a06. Please suggest.
Is it possible to fill the null value with some placeholder so it is still counted? I searched for this and found some solutions:
- change props.conf to eval the null value
- use tstats ... fillnull_value="null"
Is there another option or a best approach for this?
Mmm, I think the problem is that the min/max applies to the entire dataset rather than per series: if you don't use trellis, there is only one min/max for the entire chart, not per series.
If you want to filter out all other events, please try:

props.conf
[sourcetype]
TRANSFORMS-filter = setnull,stanza

transforms.conf
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[stanza]
REGEX = Snapshot created successfully
DEST_KEY = queue
FORMAT = indexQueue
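One pitfall worth noting: in transforms.conf the REGEX value is taken literally, so wrapping the keyword in quotes (as in the question's config) makes the pattern require literal quote characters in the event, which plain log lines usually don't contain. A quick Python check of that difference, on a made-up sample event:

```python
import re

# Hypothetical event containing the keyword without surrounding quotes.
event = "2024-08-21 03:42:50 backup: Snapshot created successfully for vol01"

# Quotes inside a transforms.conf REGEX are literal characters, so the
# quoted pattern only matches events that actually contain quote marks.
quoted = re.search(r'"Snapshot created successfully"', event)
plain = re.search(r"Snapshot created successfully", event)
print(quoted is None, plain is not None)  # True True
```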
Please specify the parameters for both stanzas as shown below, and let me know how you applied the inputs.conf: via the deployment server or locally? Please share the whole path of the settings.

[WinEventLog://Directory Service]
checkpointInterval = 5
current_only = 0
disabled = 0
index = <your_index>
start_from = oldest

[WinEventLog://DNS Server]
checkpointInterval = 5
current_only = 0
disabled = 0
index = <your_index>
start_from = oldest
Hello everyone, I want to filter data for a specific keyword "Snapshot created successfully" from a log file, but I am getting other events along with the searched keyword. My entries in props.conf and transforms.conf are as below:

props.conf
[sourcetype]
TRANSFORMS-filter = stanza

transforms.conf
[stanza]
REGEX = "Snapshot created successfully"
DEST_KEY = queue
FORMAT = indexqueue

Is there any issue here?
It is somewhat confusing what that mvexpand is supposed to do and why string merge is necessary.  As I last commented in your other post, there is nothing wrong with Splunk's left join.  Even though I want to avoid join in general, join is better than doing all that extra work.  Here is my emulation:

| makeresults format=csv data="ip_address, host
10.1.1.1, host1
10.1.1.2, host2
10.1.1.3, host3
10.1.1.4, host4
10.1.1.5, host5
10.1.1.6, host6
10.1.1.7, host7"
| rename ip_address as ip
| join max=0 type=left ip
    [makeresults format=csv data="ip, risk, score, contact
10.1.1.1, riskA, 6, ,
10.1.1.1, riskB, 7 ,
10.1.1.1, ,, person1,
10.1.1.1, riskC, 6,,
10.1.1.2, ,, person2,
10.1.1.3, riskA, 6, person3,
10.1.1.3, riskE, 7, person3,
10.1.1.4, riskF, 8, person4,
10.1.1.8, riskA, 6, person8,
10.1.1.9, riskB, 7, person9"]
| table ip, host, risk, score, contact

The output is:

ip        host   risk   score  contact
10.1.1.1  host1  riskA  6
10.1.1.1  host1  riskB  7
10.1.1.1  host1                person1
10.1.1.1  host1  riskC  6
10.1.1.2  host2                person2
10.1.1.3  host3  riskA  6      person3
10.1.1.3  host3  riskE  7      person3
10.1.1.4  host4  riskF  8      person4
10.1.1.5  host5
10.1.1.6  host6
10.1.1.7  host7

Hope this helps. (And thanks for posting data emulation.  That makes things easier.)
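The join max=0 type=left behaviour shown above can be sketched in plain Python: every host row is kept, and all matching rows from the second dataset are attached per ip (data abridged from the emulation above):

```python
# Plain-Python sketch of a left join with multiple matches per key,
# mirroring SPL's | join max=0 type=left ip.
hosts = [("10.1.1.1", "host1"), ("10.1.1.2", "host2"), ("10.1.1.5", "host5")]
risks = {
    "10.1.1.1": [{"risk": "riskA", "score": 6}, {"risk": "riskB", "score": 7}],
    "10.1.1.8": [{"risk": "riskA", "score": 6}],  # no matching host -> dropped
}

joined = []
for ip, host in hosts:
    matches = risks.get(ip) or [{}]  # left join: unmatched hosts keep one row
    for m in matches:
        joined.append({"ip": ip, "host": host,
                       "risk": m.get("risk"), "score": m.get("score")})

for row in joined:
    print(row)
```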
For example:

<html>
  <button data-token-json="{&quot;my_token&quot;:&quot;My Value&quot;}">Set the my_token token to My Value</button>
</html>

and you can then use the $my_token$ token elsewhere in your dashboard.
It's a really useful piece of JS that allows you to put HTML buttons into your dashboard that can set and unset tokens in use in the dashboard. 
Hi @elend , your two searches are completely different, so it's normal to have different results. Probably some of the additional fields that you used in the second search have empty values, so the related results are discarded from the second search's results. In other words, you cannot compare these two searches. To really compare them, you should modify the Data Model rules, adding a calculated field that, when there's an empty value for a field, assigns a fixed value (e.g. "unknown"), as you can find for the user field in the Authentication data model. Ciao. Giuseppe
Hi @dbroggy , are you speaking of security Correlation Searches, or something else? If Correlation Searches, install the Splunk Security Essentials App (https://splunkbase.splunk.com/app/3435). It has a very comprehensive list of Correlation Searches, it also permits an analysis of your data to understand which of them are applicable to your data, and it gives you a test set of data to see these Correlation Searches in action. Ciao. Giuseppe