All Posts

I am trying to use the Simple SNMP Getter add-on, but the response looks like the attached picture. Please advise. This is the input:
This seems confusing, as Splunk hasn't attempted the MongoDB upgrade yet; I would expect it to fail after the upgrade if that were the case. Edit: I ran HWiNFO on the box, and it shows AVX, AVX2 and AVX-512 as supported, so I don't think this is the issue.
That's incorrect; it's a Server 2022 box.
Hi folks, I am trying to schedule a dashboard to send an email with its details periodically. I am able to do it, but the output is not very legible: the attachment, in either .pdf or .png format, is blurry when zoomed in even slightly to read things like table data and graph points. The dashboard contains multiple panels: tables, bar graphs, pie charts and other kinds of graphs. Does increasing the font size of the data help? I don't want to alter the dashboard and make a mess unless it will surely solve the matter. I am not very good at dashboarding and am new to Splunk as well. Please advise.
Sure. Thank you.
Hi @Miguel3393, first of all, don't use the search command after the main search, because it makes your search slower. Then, you already calculated the difference in seconds in the field diffTime; you only have to add this field to the table command. Also, I'm not sure that your solution to express the duration in minutes and seconds is correct; you should use:

| eval Duracion=tostring(diffTime,"duration")

In other words, please try this:

index="cdr" ("Call.TermParty.TrunkGroup.TrunkGroupId"="2811" OR "Call.TermParty.TrunkGroup.TrunkGroupId"="2810") "Call.ConnectTime"=* "Call.DisconnectTime"=*
| lookup Pais Call.RoutingInfo.DestAddr OUTPUT Countrie
| eval Disctime=strftime('Call.DisconnectTime'/1000,"%m/%d/%Y %H:%M:%S %Q")
| eval Conntime=strftime('Call.ConnectTime'/1000,"%m/%d/%Y %H:%M:%S %Q")
| eval diffTime=('Call.DisconnectTime'-'Call.ConnectTime')
| eval Duracion=tostring(diffTime,"duration")
| table Countrie Duracion diffTime

Ciao. Giuseppe
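To illustrate the duration conversion in isolation, here is a standalone sketch you can run anywhere. Note that tostring(x,"duration") treats x as seconds, so if diffTime is a difference of epoch-millisecond timestamps you may need to divide it by 1000 first:

```spl
| makeresults
| eval diffTime=3725
``` 3725 seconds = 1 hour, 2 minutes, 5 seconds ```
| eval Duracion=tostring(diffTime,"duration")
```

which renders Duracion as 01:02:05.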
Hi @anu1, the dashboard is very easy, but it requires preparation that depends on the number of data sources you want to display in it. In a few words, you should:

- analyze your data sources and define the conditions for LOGIN, LOGOUT and LOGFAIL; e.g., for Windows, login is EventCode=4624, logout is EventCode=4634 and logfail is EventCode=4625;
- create an eventtype for each condition, assigning a tag (LOGIN, LOGOUT or LOGFAIL) to each eventtype;
- create some field aliases so the fields to display have the same names everywhere (e.g. UserName, IP_Source, Hostname, etc.);
- create a dashboard running a search like the following: tag=$tag$ host=$host$ UserName=$user$ | table _time tag HostName UserName IP_Source — the three tokens in the main search come from three inputs.

Let me know if you need help creating the dashboard; it's very easy. Ciao. Giuseppe
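A minimal sketch of the eventtype/tag setup for the Windows example above (stanza names and the sourcetype are illustrative; adapt them to your environment):

```
# eventtypes.conf
[windows_login]
search = sourcetype=WinEventLog EventCode=4624

[windows_logout]
search = sourcetype=WinEventLog EventCode=4634

[windows_logfail]
search = sourcetype=WinEventLog EventCode=4625

# tags.conf
[eventtype=windows_login]
LOGIN = enabled

[eventtype=windows_logout]
LOGOUT = enabled

[eventtype=windows_logfail]
LOGFAIL = enabled
```

With this in place, searching tag=LOGIN returns login events from every data source tagged the same way.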
@MikeMakai Please run `tcpdump` to verify if the expected logs are being received. If the expected output is observed, we can proceed to check from the Splunk side. If this reply helps you, Karma would be appreciated.
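A possible invocation for that check (assuming syslog arrives on UDP port 514 and the receiving interface is eth0; adjust both to your setup):

```shell
# capture 20 packets on UDP/514 to confirm the logs are actually arriving
sudo tcpdump -i eth0 -nn -c 20 udp port 514
```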
@MikeMakai Could you share your `inputs.conf` file? Are you sending data directly from the FMC to Splunk, or is there an intermediate forwarder between your FMC and Splunk?
Please do not use screenshots to illustrate text data; use a text table or a text box. But even the two index search screenshots are inconsistent, meaning there is no common dest_ip. You cannot expect all fields to be populated when there is no matching field value; this is basic mathematics. Like @bowesmana says, find a small number of events that you know have matching dest_ip in both indices, manually calculate what the output should be, then use the proposed searches on this small dataset. Here is a mock dataset loosely based on your screenshots but WITH matching dest_ip:

Row 1 (from index=*firewall*): src_zone=trusted, src_ip=10.80.110.8, dest_zone=untrusted, dest_ip=152.88.1.76, transport=UDP, dest_port=53, app=dns_base, rule=whatever1, action=blocked, session_end_reason=policy_deny, packets_out=1, packets_in=0, src_translated_ip=whateverNAT, dvc_name=don'tmatter
Row 2 (from index=*corelight*): dest_ip=152.88.1.76, server_name=whatever2, ssl_cipher=idon'tcare, ssl_version=TLSv3

Because your two searches operate on different indices, @gcusello's search can also be expressed with append (as opposed to OR) without much penalty.
Like this:

index="*firewall*" sourcetype=*traffic* src_ip=10.0.0.0/8
| append [search index=*corelight* sourcetype=*corelight* server_name=*microsoft.com*]
| fields src_zone, src_ip, dest_zone, dest_ip, server_name, transport, dest_port, app, rule, action, session_end_reason, packets_out, packets_in, src_translated_ip, dvc_name
| stats values(*) AS * BY dest_ip
| rename src_zone AS From, src_ip AS Source, dest_zone AS To, dest_ip AS Destination, server_name AS SNI, transport AS Protocol, dest_port AS Port, app AS "Application", rule AS "Rule", action AS "Action", session_end_reason AS "End Reason", packets_out AS "Packets Out", packets_in AS "Packets In", src_translated_ip AS "Egress IP", dvc_name AS "DC"

Using the mock dataset, the output is:

Destination=152.88.1.76, Action=blocked, Application=dns_base, DC=don'tmatter, Egress IP=whateverNAT, End Reason=policy_deny, From=trusted, Packets In=0, Packets Out=1, Port=53, Protocol=UDP, Rule=whatever1, SNI=whatever2, Source=10.80.110.8, To=untrusted

This is a full emulation for you to play with and compare with real data:

| makeresults format=csv data="src_zone, src_ip, dest_zone, dest_ip, transport, dest_port, app, rule, action, session_end_reason, packets_out, packets_in, src_translated_ip, dvc_name
trusted, 10.80.110.8, untrusted, 152.88.1.76, UDP, 53, dns_base, whatever1, blocked, policy_deny, 1, 0, whateverNAT, don'tmatter"
| table src_zone, src_ip, dest_zone, dest_ip, transport, dest_port, app, rule, action, session_end_reason, packets_out, packets_in, src_translated_ip, dvc_name
| eval index="*firewall*"
``` the above emulates index="*firewall*" sourcetype=*traffic* src_ip=10.0.0.0/8 ```
| append [| makeresults format=csv data="server_name, dest_ip, ssl_version, ssl_cipher
whatever2, 152.88.1.76, TLSv3, idon'tcare"
| eval index="*corelight*"
``` the above emulates index=*corelight* sourcetype=*corelight* server_name=*microsoft.com* ```]
| fields src_zone, src_ip, dest_zone, dest_ip, server_name, transport, dest_port, app, rule, action, session_end_reason, packets_out, packets_in, src_translated_ip, dvc_name
| stats values(*) AS * BY dest_ip
| rename src_zone AS From, src_ip AS Source, dest_zone AS To, dest_ip AS Destination, server_name AS SNI, transport AS Protocol, dest_port AS Port, app AS "Application", rule AS "Rule", action AS "Action", session_end_reason AS "End Reason", packets_out AS "Packets Out", packets_in AS "Packets In", src_translated_ip AS "Egress IP", dvc_name AS "DC"
Please share the search so far and some sample data then we might be able to help you with the search query.
Hey team, I have a requirement: I have to create a Splunk dashboard to report the # of logins and # of logouts. The inputs for the Splunk report should be as follows: input dropdowns for Time Picker, Customer and Host Name (identified either from probe data or from Splunk metrics). The output for these metrics should be shown as a time graph of the # of logins and logouts; the graph should show the time, which host and which customer we are using, and the query should use the input tokens when it runs. Can you give me the search query for this requirement? I used multiple queries but am not getting the exact data. Can you help me with the query? Thanks.
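A minimal sketch of the kind of tokenized search that could drive such a panel. This assumes login/logout events have been tagged (for example via eventtypes, as discussed elsewhere in this thread); the index, the Customer field, and the token names ($host_tok$, $customer_tok$) are illustrative, and the time range would come from the dashboard's time picker:

```spl
(tag=LOGIN OR tag=LOGOUT) host=$host_tok$ Customer=$customer_tok$
| timechart span=1h count BY tag
```

This produces one time series per tag, which renders directly as a line or column chart of logins vs. logouts over time.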
@richgalloway The perfect solution, exactly what I was looking for. Thank you
Thanks Kiran
I am sending syslog data to Splunk from Cisco FMC. I am using UCAPL compliance and therefore cannot use eStreamer. The data is being ingested into Splunk, and the dashboard is showing some basic events, like connection events, volume file events and malware events. When I try to learn more about these events, it doesn't drill down into more info. For example, when I click on the 14 Malware Events and choose Open in Search, it just shows the number of events; there is no information regarding these events. When I click on Inspect, it shows command.tstats at 13 and command.tstats.execute_output at 1. It doesn't provide further clarity regarding the malware events. When I view the Malware Files dashboard on the FMC, it shows no data for malware threats. So based on the FMC, it seems that the data in the Splunk dashboard is incorrect, or is at least interpreting malware events differently from the FMC dashboard.
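One way to see what the dashboard's accelerated (tstats) count is actually based on is to query the raw syslog events directly. This is only a sketch; the index and sourcetype placeholders below are assumptions you must replace with your own, and the "Malware" keyword simply narrows to events whose raw text mentions it:

```spl
index=<your_index> sourcetype=<your_fmc_sourcetype> "Malware"
| stats count BY host, sourcetype, source
```

If this raw search returns nothing while the dashboard shows 14 events, the dashboard's data model is likely classifying something else as a malware event.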
Example 1 - you are using dedup src/dest - you can't do that, as I explained in my other post. Example 2 - dedup here is not useful - you have a multivalue src_ip and you will not have any duplicate src_ip in there relating to the dest_ip, so it's redundant. The best way to work out what's wrong here is to remove the last two lines and just let the stats work. If you work on a small time window where you KNOW what data to expect, then you can more easily validate what's wrong. As @isoutamo says, you can even remove the stats and just do table * - but work with a very small data set where the results are predictable. Then you can build the detail back up again.
It's quotes - you are using * without quotes, so it's an invalid eval. Sometimes it's hard to debug eval token setters in dashboards, but generally, if you find that nothing appears to happen when you have an <eval> token setter, it's probably hitting an error. I often have a panel behind depends="$debug_tokens$" containing an html panel that reports token values, which I can turn on as needed. Also, in the Splunk Dashboard Examples app there is a showtokens.js which can help you uncover token issues.
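A minimal sketch of such a hidden debug panel in Simple XML (the token names tok_host and tok_user are illustrative; the row only renders once $debug_tokens$ is set):

```xml
<row depends="$debug_tokens$">
  <panel>
    <html>
      <p>tok_host = $tok_host$</p>
      <p>tok_user = $tok_user$</p>
    </html>
  </panel>
</row>
```

Setting the token (for example from a hidden input or by editing the URL) turns the panel on, letting you watch token values change as you interact with the dashboard.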
https://community.splunk.com/t5/Splunk-Enterprise/Migration-of-Splunk-to-different-server-same-platform-Linux-but/m-p/538062 — this link contains the exact steps needed, including removing the old peers from the CM, as "splunk offline --enforce-counts" alone is not enough.
Hi gcusello, Thanks for your response and suggestions. 1- As yuanliu  said, please see samples in text format (using the Insert/Edit Code Sample button). 2- I put all the search terms in the main search like you said as below:   index=nprod_database sourcetype=tigergraph:app:auditlog:8542 host=VCAUSC11EUAT* clientHost failedAttempts userAgent userName authType message timestamp actionName status "actionName":"login" "message":"Authentication failed" "status":"FAILURE" "timestamp":"2025-01-08T*"   Sample results:   {"endpoint":"/requesttoken","clientHost":"localhost:36976","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"TEST6","authType":"LDAP","message":"Generate new token successfully.\nWarning: TEST6 Support cannot restore access to secrets/tokens for security reasons. Please save your secret/token and keep it safe and accessible.","timestamp":"2024-12-30T11:27:02.881-06:00","actionName":"requestToken","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"localhost:53964","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"login succeed","timestamp":"2024-12-30T15:47:15.496-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:54556","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2024-12-30T15:47:33.226-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/authtoken","clientHost":"localhost:54552","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"Generate new token successfully.\nWarning: TEST6 Support cannot restore access to secrets/tokens for security reasons. 
Please save your secret/token and keep it safe and accessible.","timestamp":"2024-12-30T15:47:35.36-06:00","actionName":"requestToken","status":"SUCCESS"}, {"endpoint":"/gsql/userdefinedtokenfunctions","clientHost":"localhost:54556","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"getUDFs succeeded","timestamp":"2024-12-30T15:47:38.872-06:00","actionName":"getUDFs","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"100.71.65.228","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2024-12-30T15:47:39.214-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"100.71.65.229","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"exportJob succeeded","timestamp":"2024-12-30T15:47:39.292-06:00","actionName":"exportJob","status":"SUCCESS"}, {"endpoint":"/gsql/authtoken","clientHost":"localhost:54552","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"Generate new token successfully.\nWarning: TEST6 Support cannot restore access to secrets/tokens for security reasons. 
Please save your secret/token and keep it safe and accessible.","timestamp":"2024-12-30T15:47:40.63-06:00","actionName":"requestToken","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:54556","userAgent":"GraphStudio","userName":"TEST1","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2024-12-30T15:47:41.877-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST2","authType":"LDAP","message":"login succeed","timestamp":"2024-12-31T10:00:19.455-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST2","authType":"LDAP","message":"login succeed","timestamp":"2024-12-31T10:03:22.203-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"localhost:47404","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST3","authType":"LDAP","message":"login succeed","timestamp":"2024-12-31T10:18:22.9-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GrpahStudio","userName":"TEST3","authType":"LDAP","message":"Authentication failed!","timestamp":"2024-12-31T10:25:32.26-06:00","actionName":"login","status":"FAILURE"}, {"endpoint":"/requesttoken","clientHost":"localhost:35260","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"TEST6","authType":"LDAP","message":"Generate new token successfully.\nWarning: TEST6 Support cannot restore access to secrets/tokens for security reasons. 
Please save your secret/token and keep it safe and accessible.","timestamp":"2024-12-31T11:00:05.35-06:00","actionName":"requestToken","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST3","authType":"LDAP","message":"login succeed","timestamp":"2024-12-31T11:24:31.435-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:47318","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"c089265","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2024-12-31T21:15:55.995-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:38336","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"c089265","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2025-01-02T11:36:45.844-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"100.82.128.85","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"c089265","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2025-01-03T03:59:10.235-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"localhost:38012","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"login succeed","timestamp":"2025-01-06T13:47:43.429-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GrpahStudio","userName":"TEST4","authType":"LDAP","message":"Authentication failed!","timestamp":"2025-01-06T13:48:27.717-06:00","actionName":"login","status":"FAILURE"}, 
{"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.228","failedAttempts":0,"userAgent":"GrpahStudio","userName":"TEST4","authType":"LDAP","message":"Authentication failed!","timestamp":"2025-01-06T13:48:32.587-06:00","actionName":"login","status":"FAILURE"}, {"endpoint":"/gsql/simpleauth","clientHost":"/127.0.0.1:43520","failedAttempts":0,"userAgent":"GrpahStudio","userName":"TEST4","authType":"LDAP","message":"Authentication failed!","timestamp":"2025-01-06T13:48:36.03-06:00","actionName":"login","status":"FAILURE"}, {"endpoint":"/gsql/simpleauth","clientHost":"localhost:60404","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST5","authType":"LDAP","message":"login succeed","timestamp":"2025-01-06T19:59:28.295-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.229","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST5","authType":"LDAP","message":"login succeed","timestamp":"2025-01-06T19:59:40.885-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/login","clientHost":"localhost:53886","clientOSUsername":"TEST4","failedAttempts":0,"userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"login succeeded","timestamp":"2025-01-06T20:45:36.492-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"10.138.170.165","clientOSUsername":"TEST4","userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"showCatalog succeeded","timestamp":"2025-01-06T20:45:37.241-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/login","clientHost":"localhost:39154","clientOSUsername":"TEST4","failedAttempts":0,"userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"login succeeded","timestamp":"2025-01-06T20:46:48.666-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"10.138.170.165","clientOSUsername":"TEST4","userAgent":"GSQL 
Shell","userName":"TEST4","authType":"LDAP","message":"showCatalog succeeded","timestamp":"2025-01-06T20:46:49.376-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/login","clientHost":"10.138.170.165","clientOSUsername":"TEST4","failedAttempts":0,"userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"login succeeded","timestamp":"2025-01-06T20:47:14.033-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"localhost:39154","clientOSUsername":"TEST4","userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"Successfully used graph 'aml_risk_graph'.","timestamp":"2025-01-06T20:47:14.863-06:00","actionName":"useGraph","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"localhost:39154","clientOSUsername":"TEST4","userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"Successfully created query 'Del_orphan_edges_for_previous_primary'.","timestamp":"2025-01-06T20:47:17.079-06:00","actionName":"createQuery","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.228","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"login succeed","timestamp":"2025-01-06T20:55:21.895-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:43048","userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2025-01-06T20:56:34.057-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/queries","clientHost":"localhost:43048","userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"showQuery succeeded","timestamp":"2025-01-06T20:56:34.781-06:00","actionName":"showQuery","status":"SUCCESS"}, {"endpoint":"/gsql/file","clientHost":"localhost:39154","clientOSUsername":"TEST4","userAgent":"GSQL Shell","userName":"TEST4","authType":"LDAP","message":"Successfully 
installed query [Del_orphan_edges_for_previous_primary].","timestamp":"2025-01-06T20:57:36.229-06:00","actionName":"installQuery","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"100.71.65.229","userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2025-01-06T20:57:46.93-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/queries","clientHost":"localhost:60138","userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"showQuery succeeded","timestamp":"2025-01-06T20:57:47.563-06:00","actionName":"showQuery","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"localhost:48050","failedAttempts":0,"userAgent":"GraphStudio","userName":"sxvedag","authType":"LDAP","message":"login succeed","timestamp":"2025-01-07T08:56:29.476-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/gsql/schema","clientHost":"localhost:48036","userAgent":"GraphStudio","userName":"sxvedag","authType":"LDAP","message":"Successfully got schema for graph aml_risk_graph.","timestamp":"2025-01-07T08:56:38.165-06:00","actionName":"showCatalog","status":"SUCCESS"}, {"endpoint":"/gsql/queries","clientHost":"100.71.65.229","userAgent":"GraphStudio","userName":"sxvedag","authType":"LDAP","message":"showQuery succeeded","timestamp":"2025-01-07T08:56:38.823-06:00","actionName":"showQuery","status":"SUCCESS"}, {"endpoint":"/gsql/simpleauth","clientHost":"100.71.65.228","failedAttempts":0,"userAgent":"GraphStudio","userName":"TEST4","authType":"LDAP","message":"login succeed","timestamp":"2025-01-07T10:16:11.689-06:00","actionName":"login","status":"SUCCESS"}, {"endpoint":"/requesttoken","clientHost":"localhost:45058","userAgent":"Apache-HttpClient/5.2.3 (Java/17.0.13)","userName":"TEST6","authType":"LDAP","message":"Generate new token successfully.\nWarning: TEST6 Support cannot restore access to secrets/tokens for security reasons. 
Please save your secret/token and keep it safe and accessible.","timestamp":"2025-01-08T09:20:02.381-06:00","actionName":"requestToken","status":"SUCCESS"} ]   But here I don't know why the filter is giving me results with "status":"SUCCESS" and "message":"login succeed" even though my login-failed condition is "message":"Authentication failed!". Maybe my condition in the search query is wrong. Also, I was trying to restrict the results to one timestamp, but it is showing results for other timestamps as well. 3- I also plan to check the condition for each host in our infrastructure and each account by using my search below:   index=nprod_database sourcetype=tigergraph:app:auditlog:8542 host=VCAUSC11EUAT* clientHost failedAttempts userAgent userName authType message timestamp actionName status "actionName":"login" "message":"Authentication failed" "status":"FAILURE" "timestamp":"2025-01-08T*"   4- I tried to use the stats command to aggregate results and the where command to filter them, something like this:   index=nprod_database sourcetype=tigergraph:app:auditlog:8542 host=VCAUSC11EUAT* clientHost failedAttempts userAgent userName authType message timestamp actionName status "actionName":"login" "message":"Authentication failed" "status":"FAILURE" "timestamp":"2025-01-08T*" | stats count BY actionName host | where count>3   Results: No results found. Thank you for your help.
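One likely cause, offered as an assumption worth testing: bare words like clientHost, failedAttempts, etc. and quoted fragments like "actionName":"login" in the base search are matched as raw-text terms, not as field conditions, so the combination can match more (or fewer) events than intended. A field-based version of the failed-login filter might look like this; the field names are taken from the sample events above, and the spath call assumes each event is valid JSON:

```spl
index=nprod_database sourcetype=tigergraph:app:auditlog:8542 host=VCAUSC11EUAT*
| spath
| search actionName="login" status="FAILURE" message="Authentication failed!"
| stats count BY userName, host
```

Filtering on extracted fields this way also makes the timestamp restriction unnecessary: use the time range picker (or earliest=/latest=) instead of matching "timestamp":"2025-01-08T*" as text.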
Maybe this can be the solution to your challenge: https://community.splunk.com/t5/Deployment-Architecture/KVStore-does-not-start-when-running-Splunk-9-4/m-p/708304/thread-id/29016/highlight/false#M29017