Your current inputs stanza monitors all the logs in the folder D:\logs, and you are already sending various events from those logs to the null queue. Now you want to be more selective: keep INFO-level events from one log file while still stopping certain events from the others. This gets a little tricky without testing and having a tinker. Some options that may work:

Option 1: Move that log (jkl.txt) to another folder or a subfolder and monitor it separately with its own monitor stanza, props, and transforms so you can control it independently. This leaves the others where they are, and you can ingest and filter this one on its own.

Option 2: Rework your current props and transforms. You may be able to match by source in props.conf; do this for each of your other logs and send them to the null queue. Either way, this all needs some level of config work and testing.

[source::...my_otherlog.txt]
TRANSFORMS-my_otherlog = my_otherlog_file_null
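For reference, a minimal sketch of the null-queue routing that Option 2 relies on. The stanza names and the regex below are illustrative, not taken from the poster's actual config, and they assume the events carry a literal level token such as INFO or DEBUG:

props.conf:
[source::D:\logs\jkl.txt]
TRANSFORMS-keep_info_only = jkl_drop_non_info

transforms.conf:
# Route anything that is not an INFO event to the null queue (illustrative regex)
[jkl_drop_non_info]
REGEX = \b(DEBUG|WARN|ERROR|FATAL)\b
DEST_KEY = queue
FORMAT = nullQueue

Note that these index-time transforms must live on whatever instance parses the data (indexers or heavy forwarders); a universal forwarder will not apply them.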
Hello. I have tried different combinations of the replicationDenyList stanza definition, and in all cases it did not work:

with quotes: "apps\TA-microsoft-graph-security-add-on-for-splunk\bin\..."
without quotes: apps\TA-microsoft-graph-security-add-on-for-splunk\bin\...
with *: "apps\TA-microsoft-graph-security-add-on-for-splunk\bin\*"
with the full path: D:\Splunk Search Head\etc\apps\TA-microsoft-graph-security-add-on-for-splunk\bin\*

and combinations of them. Every time I got the error:

Invalid key in stanza [replicationDenyList] in D:\Splunk Search Head\etc\system\local\distsearch.conf, line 29: MSbin (value: apps\TA-microsoft-graph-security-add-on-for-splunk\bin\*).

Do you have a working example of this stanza? Thanks for your help.
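For comparison, here is the general shape of a bundle-replication deny entry in distsearch.conf. The stanza spelling is my best recollection of the documented one (check distsearch.conf.spec for your version; older releases use [replicationBlacklist]), and the documented examples use forward slashes in paths relative to $SPLUNK_HOME/etc, which may itself be why the backslash variants above are rejected:

[replicationDenylist]
# key name is arbitrary; the value is a wildcard path pattern
MSbin = apps/TA-microsoft-graph-security-add-on-for-splunk/bin/*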
Hi, I came across many queries to calculate the daily ingest per index for the last 7 days, but I am not getting the expected results. Can you please guide me to a query that calculates the daily ingest per index in GB for the last 7 days?
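One common approach, sketched here rather than taken from an authoritative answer, is to read the license usage log; b (bytes) and idx (index) are standard fields in license_usage.log, though the per-index figures reflect licensed ingest volume rather than disk usage:

index=_internal source=*license_usage.log type=Usage earliest=-7d@d latest=@d
| eval GB=round(b/1024/1024/1024, 3)
| timechart span=1d sum(GB) by idx

Run it where the license master's _internal index is searchable, since that is where license_usage.log is written.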
Try cutting it down so that it remains valid and representative and then paste it here.
You have not shown anything that indicates that the search has the value you are seeking on the first row of your results. Please share your search and follow @bowesmana's suggestion about which token to use to retrieve the results.
The result coming back is a single column:

Panels
Blacklisted Software Exceptions
Clients missing critical updates
Clients with blacklisted Software
Clients with old Defender patterns
Critical severity vulnerabilities
Defender enrollment status
High severity vulnerabilities
Local virtual machines
Outdated operating systems - Endpoint
Outdated operating systems - Unknown
Outdated operating systems - Server
Servers with blacklisted Software
Systems not found in patch management database
Total Installed blacklisted Software
Vulnerabilities solved

but I want each result in a different section (column) of the table.
Is there a table visualization in Splunk?
Hi danspav, thank you so much. The query took around 300 seconds across around 10 indexes (4 TB database size) and returns what I'm looking for. Perfect!
You can also add this to the end of the search in the previous post; it will make each column name the value of the panel and set that column's value to 1:

| foreach row* [ eval {<<FIELD>>}=1 ]
| fields - row*
You can do this:

| inputlookup panels.csv
| transpose 0

That will give you columns called row 1, row 2, row 3, and so on, with the values found. What do you want the column headings to be?
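If you want to try that without the lookup, here is a small self-contained simulation (makeresults stands in for panels.csv, and the panel values a, b, c are made up):

| makeresults count=3
| streamstats count AS n
| eval panels=mvindex(split("a,b,c", ","), n-1)
| table panels
| transpose 0

This yields one row with columns row 1=a, row 2=b, row 3=c, which is the shape that the foreach trick above then renames.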
Building on @gcusello's approach, it can be done more efficiently by not using mvexpand and just filtering out the values that do not match the max:

index=abc host IN ()
| eval col=_time."|".response_time
| stats max(response_time) AS max_response_time values(col) AS col BY URL
| eval times=mvmap(col, if(match(col, "\|".max_response_time."$"), mvindex(split(col, "|"), 0), null()))
| fields URL max_response_time times
| eval times=strftime(times, "%F %T.%Q")
| rename max_response_time as "Maximum Response Time"
| sort - "Maximum Response Time"

Note that if you have LOTS of values and lots of URLs you may get a spike in memory usage from retaining all the values. Note this also handles the situation where the max response time occurs at more than one time.

You can also do this with eventstats:

index=abc host IN ()
| fields _time response_time URL
| eventstats max(response_time) AS max_response_time by URL
| where response_time=max_response_time
| stats values(max_response_time) AS "Maximum Response Time" values(_time) as times BY URL
| eval times=strftime(times, "%F %T.%Q")
| sort - "Maximum Response Time"

Check which will perform better with your data - eventstats can be slow if crunching lots of data.

You can see how either of the techniques above works by replacing index=abc... with this, which will give you some simulated data:

| makeresults count=1000
| streamstats c
| eval _time=now() - c
| eval response_time=random() % 1000
| eval URL=mvindex(split("URL1,URL2,URL3,URL4",","), random() % 4)
| fields - c
Thanks for the quick response. I want it horizontally, which it is now showing thanks to you, but I want to display all the content in a table. Can we do that?
Try this:

| inputlookup your_lookup.csv
| stats values(panels) as panels
| eval panels=mvjoin(panels, " ")
@anooshacno it's not; you need to look at the $result.has_runtime$ token. See my example.
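For anyone following along: $result.<fieldname>$ tokens are populated from the first row of a search's results and are typically captured in a <done> handler. A minimal Simple XML sketch, where the search and token names are illustrative and only the has_runtime field name comes from this thread:

<search>
  <query>index=_internal | head 1 | eval has_runtime="yes" | table has_runtime</query>
  <done>
    <set token="has_runtime_tok">$result.has_runtime$</set>
  </done>
</search>

Elsewhere in the dashboard, $has_runtime_tok$ then carries the value from the first result row.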
I want to show lookup file content horizontally. E.g. rather than this:

panels
a
b
c

I want:

panels a b c

OR

a b c
I get this error if I add it in a different row as you mentioned above, @ITWhisperer. Any suggestions please?
Hi @Devi13, please try this:

index=abc host IN ()
| eval col=_time."|".response_time
| stats max(response_time) AS "Maximum Response Time" values(col) AS col BY URL
| mvexpand col
| rex field=col "^(?<_time>[^\|]+)\|(?<response_time>.+)"
| where 'Maximum Response Time'=response_time
| table URL "Maximum Response Time" _time
| sort - "Maximum Response Time"

Maybe it's also possible using eval in the stats command.

Ciao.
Giuseppe
Hello @gcusello, thank you for your response. I am looking for something like the below:

URL       Maximum Response Time   Time at which the maximum response got hit
abc.com   22.346                  2024-04-24 00:00:25

So at 2024-04-24 00:00:25, the URL abc.com got its maximum response time. I need to append the time which corresponds to the maximum response time of the URL.
Hi @Devi13, what do you mean by "respective time"? If you mean time info about the search (e.g. min_time, max_time, search execution time, etc.), you could add "| addinfo" at the end of your search and choose the info you want. For more info see https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Addinfo

Ciao.
Giuseppe
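As a quick illustration of what addinfo returns (the info_* field names are the documented ones; the strftime formatting is just for readability):

| makeresults
| addinfo
| eval earliest=strftime(info_min_time, "%F %T"), latest=strftime(info_max_time, "%F %T")
| table earliest latest info_sid

info_min_time and info_max_time are the bounds of the search's time range, and info_sid is the search ID.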
index=abc host IN ()
| stats max(response_time) as "Maximum Response Time" by URL
| sort - "Maximum Response Time"

I need to add the respective time for the maximum response time along with the stats. Could you please help?