All Topics

So I have a search that queries hosts that are reporting their syslogs via the Meta Hoot! application for Splunk. As of now the search is only a Single Value; however, I would like to add a trend indicator using 'timechart' for the previous 24 hours. Here is the search string:

inputlookup meta_woot where index=* sourcetype=syslog
| stats dc(host) as "Hosts"

How can I incorporate 'timechart' to add the uptick/downtick trend indicator?
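A trend indicator generally compares the count in the current window against the count in the window before it. As a rough sketch of that logic outside SPL (the host names, timestamps, and window size here are invented for illustration):

```python
from datetime import datetime, timedelta

def host_trend(events, now, window=timedelta(hours=24)):
    """Compare distinct-host counts for the current and previous 24h windows.

    events: list of (timestamp, host) tuples.
    Returns (current_count, delta_vs_previous_window).
    """
    current = {h for t, h in events if now - window <= t < now}
    previous = {h for t, h in events if now - 2 * window <= t < now - window}
    return len(current), len(current) - len(previous)

now = datetime(2024, 1, 2, 12, 0)
events = [
    (now - timedelta(hours=1), "web01"),
    (now - timedelta(hours=2), "web02"),
    (now - timedelta(hours=30), "web01"),  # falls in the previous window
]
print(host_trend(events, now))  # → (2, 1): 2 hosts now, up 1 from yesterday
```

A positive delta corresponds to the uptick arrow, a negative one to the downtick.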
Hi. I created the following search, which reports events of Active Directory users being locked out, aggregated by username:

index="active_directory" sourcetype=XmlWinEventLog source="XmlWinEventLog:Security" EventCode=4740
| stats count BY user

To be notified when the overall amount is above a threshold, I want to create an alert on it. Of course I could extend this base search to only return a result if the number of events is above the threshold, and trigger the alert if the number of results is greater than zero:

index="active_directory" sourcetype=XmlWinEventLog source="XmlWinEventLog:Security" EventCode=4740
| stats count BY user
| stats sum(count) AS sum
| search sum > 100

But in this case the alert result would only consist of the number of events. To get the list of the events, one would then need to manually run the base search with the correct time range. So I came to the custom trigger condition. As the documentation doesn't say whether it should work, I just tried to use the last two lines as the trigger condition:

stats sum(count) AS sum | search sum > 100

Unfortunately this doesn't seem to work. Does anyone have an idea how this could be solved alternatively?
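The desired behaviour amounts to: keep the per-user rows as the alert payload, but only fire when the total crosses the threshold. A sketch of that condition as plain logic (the user names and counts are invented):

```python
def lockouts_over_threshold(rows, threshold=100):
    """rows: list of (user, count) pairs, as produced by the per-user
    aggregation. Return all rows if the total count exceeds the
    threshold, otherwise an empty list (meaning: do not alert)."""
    total = sum(count for _, count in rows)
    return rows if total > threshold else []

print(lockouts_over_threshold([("alice", 60), ("bob", 55)]))
# → [('alice', 60), ('bob', 55)]  (total 115 > 100, alert fires with detail rows)
print(lockouts_over_threshold([("carol", 5)]))
# → []  (below threshold, no alert)
```

The point of returning the original rows rather than the sum is that the alert then carries the per-user detail, so nobody has to re-run the base search manually.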
Hello All,

I have a table view where I need to add a custom cell expansion with a + button before the value. If I click the + button before a cell value, I need to get an expansion of all the associated values.

Main search:
index=_internal | stats values(sourcetype) by host

Cell expansion search:
index=_internal | search cellclickvalue | stats values(Source) by cellclickvalue
Hi guys, I'm getting this error while trying to configure a scripted input for the cve_lookup app. What do you think could be the cause of it?
Microsoft Defender ATP (MDATP) events can be sent to a blob storage account or an Event Hub. I was wondering if anyone is collecting MDATP events either way, and what the setup was to parse the events? Thx
I am looking to visualize the start and end time of events by IP within a very narrow time frame. The attached image shows what I imagine the visualization to look like. I guess this would use the horizontal bar chart? Can one create this type of visualization in Splunk without additional plugins?
Can you please help me with a search to display a list of servers with a status of Running or Shutdown? I have a list of hostnames, but I am not sure how to show whether the server status is Running or Shutdown. Eventually I have to build a dashboard out of it.
Hello everyone,

I am implementing a solution found here: How to pass multivalue tokens to other splunk dashboard URL. This solution uses token manipulation to construct a string from each selected multivalue value and then pass it to another dashboard. I have managed to set up the framework, but it is not working as expected.

The problem in a nutshell is that the & and = characters in the token string are converted into their URL encodings, %26 and %3D respectively. This causes the receiving dashboard to interpret the whole string as a single value, which of course is invalid.

Here is the code for two connected run-anywhere dashboards that demonstrate the issue.

Sending side:

<form>
  <label>Multipass</label>
  <fieldset submitButton="false">
    <input type="multiselect" token="tok_input">
      <label>Input</label>
      <choice value="*">All</choice>
      <choice value="Value_1">Value_1</choice>
      <choice value="Value_2">Value_2</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <delimiter> </delimiter>
      <change>
        <set token="tok_input_send" delimiter="&amp;form.tok_input_received=">$value$</set>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>| makeresults | eval values="$tok_input_send$" | table values</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">cell</option>
        <option name="percentagesRow">false</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
        <drilldown>
          <link target="_blank">/app/search/multipass_target?form.tok_input_received=$tok_input_send$</link>
        </drilldown>
      </table>
    </panel>
  </row>
</form>

Receiving side:

<form>
  <label>Multipass Target</label>
  <fieldset submitButton="false">
    <input type="multiselect" token="tok_input_received">
      <label>Input</label>
      <choice value="*">All</choice>
      <choice value="Value_1">Value_1</choice>
      <choice value="Value_2">Value_2</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <delimiter> </delimiter>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>| makeresults | eval values="$tok_input_received$" | table values</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>

You'll see how what is sent ends up interpreted on the receiving side. Can somebody help solve this approach, or propose another? Thanks! Andrew
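The %26/%3D substitution described above is standard URL percent-encoding of & and =. The round trip can be seen with Python's urllib (the token string here is invented for illustration):

```python
from urllib.parse import quote, unquote

# A token string that embeds & and = on purpose
token = "Value_1&form.tok_input_received=Value_2"

# Encode every reserved character (safe="" means even "/" gets encoded)
encoded = quote(token, safe="")
print(encoded)
# → Value_1%26form.tok_input_received%3DValue_2

# Decoding restores the original string exactly
print(unquote(encoded) == token)  # → True
```

This is why the receiving dashboard sees one opaque value: the delimiters that would split the string into separate form parameters have been neutralized by the encoding step.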
I have a field I am trying to split into new fields and it's not taking. The strings look similar to this: "AV:N/AC:P/PR:X", and I'm trying to extract the vector to equal just the first value (AV:N). I am trying to extract each part between the slashes (var1=AV:N, var2=AC:P) but am not sure why it's not taking. My props.conf is below; any help with the regex, or why this may not be working, is greatly appreciated!

[sourcetype]
EXTRACT-vector = AV:(?<field_trying_to_extract_from>\w+)

[sourcetype]
Eval-vector = case(vector="AV:N", "Network", vector="AV:A", "Adjacent", vector="AV:L", "Local", vector="AV:P", "Physical")
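For reference, the intended extraction can be sanity-checked outside Splunk with an equivalent regex and split (the field value is taken from the question; the variable names are illustrative):

```python
import re

value = "AV:N/AC:P/PR:X"

# First component only (the vector), e.g. "AV:N"
vector = re.match(r"(AV:\w+)", value).group(1)
print(vector)  # → AV:N

# Each slash-separated part, keyed by its metric name
parts = dict(p.split(":", 1) for p in value.split("/"))
print(parts)   # → {'AV': 'N', 'AC': 'P', 'PR': 'X'}
```

Note that the capture group `AV:(?<...>\w+)` in the props.conf above captures only the part after "AV:" (i.e. "N"), not "AV:N"; if the downstream case() comparisons expect "AV:N", the capture group needs to include the prefix, as in the regex sketched here.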
I have a search that does the following:

| inputlookup system_scores.csv
| search "big search goes here"
| fields server_org both_server_desktop_score desktop_score server_score

The search gives me my custom score for each org and the score for each type of machine (which is what I expect from the search):

server_org  both_server_desktop_score  desktop_score  server_score
Bob         60                         10             40
Alice       40                         10             30
Jill        10                         5              5

However, I would like to get each column into a row for each of the scores/types of device. I envision the result would look like this:

server_org  server_type                Score
Bob         both_server_desktop_score  60
Bob         desktop_score              10
Bob         server_score               40
Alice       both_server_desktop_score  40
Alice       desktop_score              10
Alice       server_score               30
Jill        both_server_desktop_score  10
Jill        desktop_score              5
Jill        server_score               5

Is this possible? Any ideas?
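What is being asked for is a wide-to-long reshape (an unpivot); in SPL the `untable` command is built for this kind of transformation. As plain data manipulation it looks like the following sketch (scores copied from the table above, two orgs shown for brevity):

```python
wide = [
    {"server_org": "Bob", "both_server_desktop_score": 60,
     "desktop_score": 10, "server_score": 40},
    {"server_org": "Alice", "both_server_desktop_score": 40,
     "desktop_score": 10, "server_score": 30},
]

score_columns = ("both_server_desktop_score", "desktop_score", "server_score")

# Unpivot: one output row per (org, score-type) pair
long_rows = [
    {"server_org": row["server_org"], "server_type": col, "Score": row[col]}
    for row in wide
    for col in score_columns
]

print(long_rows[0])
# → {'server_org': 'Bob', 'server_type': 'both_server_desktop_score', 'Score': 60}
print(len(long_rows))  # → 6 (2 orgs × 3 score columns)
```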
Hi there, I have 2 questions around the DS behaviour in terms of how it handles apps already on servers. Currently a group of servers is all managed manually; this will change soon.

1. If I have an HF that is shared by 2 teams who have put manual apps on there, and I want to add a DS to it to manage the apps for 1 of the teams, will there be issues if the other team continues to add/edit their apps manually? I realise this is not ideal practice, but it has to be this way for <insert reason here>.

2. With regards to the other servers, if I suddenly plug in a DS, will it delete all the apps on the servers if they don't appear in the serverclass I create? Or will it just leave any apps manually put there previously, which I can then go in and delete later? This includes Splunkbase apps with local config (my understanding is that it will only replace default?).

Thanks!
Hi everyone,

Introduction: We have Palo Alto products, and we have also installed the appropriate add-on and apps. We mapped the data into the data models; the relevant data model for my question is the Web data model. There is a field called dest in the DM, and for the Web data model (for Palo Alto) I need its value to be "dest_hostname" (the current value is "dest_ip"). The current value is relevant for the Network Traffic DM, therefore I don't want to change it.

Question: Is it OK to add additional fields to the built-in data models of Splunk ES, or is it not recommended? What are the downsides of an action like this?

Thanks!
We are collecting WinEventLog data from Security, Application & System. In Security we want to disable a particular Event Code that has a corresponding New_Process_Name:

EventCode=4688
New_Process_Message=C:\\Program Files (x86)\\Symantec\\Symantec Endpoint Protection Manager\\bin\\xxxx.exe

So how can I write the inputs.conf to blacklist this EventCode with New_Process_Message? Similarly, I have around 30+ New_Process_Message values for EventCode=4688, so how can I blacklist all of them? Kindly help to provide the inputs.conf for the same.
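Windows event blacklisting in inputs.conf is regex-based, so the escaping of backslashes and parentheses matters. One way to sanity-check a candidate pattern before deploying it (the path is taken from the question; the test harness itself is illustrative, not Splunk's matching engine):

```python
import re

# Every literal backslash and parenthesis in the path must be escaped.
# The trailing .*\.exe would let one pattern cover all executables in
# that bin directory, instead of 30+ separate entries.
pattern = re.compile(
    r"C:\\Program Files \(x86\)\\Symantec"
    r"\\Symantec Endpoint Protection Manager\\bin\\.*\.exe"
)

msg = r"C:\Program Files (x86)\Symantec\Symantec Endpoint Protection Manager\bin\xxxx.exe"
print(bool(pattern.search(msg)))  # → True

other = r"C:\Windows\System32\cmd.exe"
print(bool(pattern.search(other)))  # → False
```

If the 30+ process paths share a common prefix like this one, a single prefix-plus-wildcard regex in the blacklist entry is usually simpler to maintain than an alternation of 30 literal paths.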
Hello Everyone,

I have a really simple question but I can't figure it out for the life of me. I have a query set up that gives me the utilization of an array, and I want to have a text-based field for its RAG status. This is what I'm using:

| eval RAG=(Class='DB' AND Utilization >= 62, "Red", Utilization >= 50, "Yellow", Utilization < 40, "Green")

I've tried to run it and I keep getting the "eval statement is malformed" error. Any help you can give would be appreciated.
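The eval appears malformed because the condition/value pairs are not wrapped in a function; SPL expects something like case(...) around them. The intended first-match-wins logic, sketched in Python with the thresholds from the question:

```python
def rag(class_, utilization):
    """First matching condition wins, mirroring SPL's case() semantics."""
    if class_ == "DB" and utilization >= 62:
        return "Red"
    if utilization >= 50:
        return "Yellow"
    if utilization < 40:
        return "Green"
    return None  # case() returns NULL when no condition matches

print(rag("DB", 70))   # → Red
print(rag("App", 55))  # → Yellow
print(rag("App", 45))  # → None: 40-50 falls through every condition
```

Note the gap the sketch exposes: a utilization between 40 and 50 matches none of the three conditions, so the status field would be empty for those rows unless a catch-all branch is added.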
Enabled 3 ESCU rules in ES and mapped them in SSE using Content Introspection on the Manage Bookmarks page. After a page refresh, all 3 rules showed up as "Successfully Implemented" in the main list of the Manage Bookmarks page, but not in the top statistics (the big numbers at the top of the page). Then I changed the bookmark status to "Needs Tuning" for all 3 rules. The top statistics were not updated, and after the next page refresh the status of the 3 rules changed back to "Successfully Implemented". Not sure if I missed some steps, did them in the wrong order, or whether this is a bug. Any suggestion? Thank you.
Hi,

I have an input field whose value is a week number. Based on the week number selected, how do I pass the earliest and latest date into my drilldown? Here is my input field:

<input type="dropdown" token="weeknum" searchWhenChanged="true">

And here is the drilldown section from one of the dashboard panels, where the time range gets passed to another page (sre_module_summary) via the tokens selectedearliest and selectedlatest. How do I derive the values for these tokens from the weeknum selected in the input panel?

<drilldown target="_blank">
  <eval token="Module">$click.value$</eval>
  <eval token="HostType">$HostType$</eval>
  <link>
    <![CDATA[/app/sre/sre_module_summary?form.Module=$Module$&host=$HostType$&form.timerange.earliest=$selectedearliest$&form.timerange.latest=$selectedlatest$]]>
  </link>
</drilldown>

Could someone please help?
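Turning a week number into an earliest/latest pair is a date calculation; one way to sketch it (assuming ISO week numbering, and that the year is known or fixed, since a week number alone is ambiguous across years):

```python
from datetime import date, timedelta

def week_range(year, weeknum):
    """Return (earliest, latest) calendar dates for an ISO week number."""
    earliest = date.fromisocalendar(year, weeknum, 1)  # Monday of that week
    latest = earliest + timedelta(days=6)              # Sunday of that week
    return earliest, latest

print(week_range(2023, 1))
# → (datetime.date(2023, 1, 2), datetime.date(2023, 1, 8))
```

The resulting dates (or their epoch equivalents) are what would populate the selectedearliest/selectedlatest tokens when the weeknum input changes.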
Hello, I am looking to see if it is possible to migrate data (around 20TB) from RSA Netwitness MongoDB to Splunk. Would this be possible with DB Connect? Has someone here done a similar migration? Thanks for thinking with me! 
Trying to collect my AWS data using an on-prem Splunk instance. I need to go via a proxy to access anything on the internet, sts.amazon.com etc. for example. I've added proxy settings to the environment and splunk-launch.conf, which seems to do the trick for most other apps: I can get my New Relic data and can see the access going via the proxy.

For the AWS app, if I leave the app-specific proxy config blank, I can't add any account, because it appears to ignore the global proxy settings and tries to go direct when I click Add. If I configure the app-specific proxy with the same settings as the system (except for the no-proxy list, because there is no setting for that), I can't even get to the account-add screen, because it tries to proxy the call to the localhost:8089 management port.

Anyone know how to configure it to make it work? Thanks
I want to customise the search using an input passed from the dashboard. The field in the logs has a value like CLASS="/x/y/z/abc/query/v1 Need quick Response". The input value passed from the dashboard is x-y-z-abc-query-v1. We want to search the logs using that input parameter against the CLASS field containing the value /x/y/z/abc/query/v1.

Since the input parameter format is x-y-z-abc-query-v1 and the value to search is /x/y/z/abc/query/v1, how should the query be written? The final query should look like:

index=yyy CLASS="/x/y/z/abc/query/v1*"

But how do I convert the input parameter x-y-z-abc-query-v1 into /x/y/z/abc/query/v1 for searching the CLASS field value? Please assist.
Hello All, I have a question. I am getting the below warning on all the panels of my dashboard, and the searches are taking a lot of time to execute:

Configuration initialization for /opt/splunk/etc took longer than expected (1953ms) when dispatching a search with search ID _bWFkaHVyaS5tYXlhbmtAcm9qb2NvbnN1bHRhbmN5LmNvbQ_bWFkaHVyaS5tYXlhbmtAcm9qb2NvbnN1bHRhbmN5LmNvbQ__search__search1_1598863806.484277. This usually indicates problems with underlying storage performance.

I checked the disk usage and CPU utilization, but those are not an issue here. I am on Splunk 7.2.4 (build 8a94541dcfac). Can you please help me troubleshoot and resolve this?

Thanks,
Madhuri