All Topics

Hi Team, is there any way we can calculate the time duration between two different events, such as a start and an end? For example: we have a start event at 10/10/23 23:50:00.031 and an end event at 11/10/23 00:50:00.031. How can we calculate this? Please help. Thank you
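A sketch of one common approach: find the earliest and latest timestamp per pair and subtract. This assumes the start/end events share a correlation field (the field `session_id` below is a hypothetical example) and contain the literal words "start" and "end":

```
index=myindex ("start" OR "end")
| stats earliest(_time) as start_time, latest(_time) as end_time by session_id
| eval duration_sec = end_time - start_time
| eval duration = tostring(duration_sec, "duration")
```

The `tostring(X, "duration")` conversion renders the difference as HH:MM:SS. If the events are not correlated by a field, the `transaction` command with `startswith`/`endswith` is an alternative, at a higher cost.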
How do I calculate a total when aggregating using stats max(field)? Thank you for your help.

Max TotalScore is the sum of the maximum of each Score field when aggregating all rows using stats: max(Score1), max(Score2), max(Score3). TotalScore is the total of the Score fields for each row (without aggregation).

This is the output I need:

Class   Name     Subject  TotalScore  Score1  Score2  Score3  Max TotalScore
ClassA  grouped  grouped  240         85      95      80      260

My Splunk search:

index=scoreindex
| stats values(Name) as Name, values(Subject) as Subject, max(TotalScore) as TotalScore, max(Score1) as Score1, max(Score2) as Score2, max(Score3) as Score3 by Class
| table Class, Name, Subject, TotalScore, Score1, Score2, Score3

I think my search above is going to display the following:

Class   Name                 Subject       TotalScore  Score1  Score2  Score3
ClassA  Name1 Name2 Name3    Math English  240         85      95      80

This is the whole data in table format from scoreindex:

Class   Name   Subject  TotalScore  Score1  Score2  Score3
ClassA  Name1  Math     170         60      40      70
ClassA  Name1  English  195         85      60      50
ClassA  Name2  Math     175         50      60      65
ClassA  Name2  English  240         80      90      70
ClassA  Name3  Math     170         40      60      70
ClassA  Name3  English  230         55      95      80
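One possible sketch: compute the per-field maxima with stats as in the question, then sum the resulting columns with eval to get Max TotalScore (field names taken from the question):

```
index=scoreindex
| stats values(Name) as Name, values(Subject) as Subject,
        max(TotalScore) as TotalScore,
        max(Score1) as Score1, max(Score2) as Score2, max(Score3) as Score3 by Class
| eval MaxTotalScore = Score1 + Score2 + Score3
| table Class, Name, Subject, TotalScore, Score1, Score2, Score3, MaxTotalScore
```

With the sample data this yields 85 + 95 + 80 = 260 for ClassA, matching the desired Max TotalScore column.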
Hi all, I have to parse logs extracted from Logstash. I'm receiving Logstash logs in JSON format, and almost all the fields I need are already parsed and available in the JSON. My issue is that the raw event data is in a field called "message", and the fields inside it aren't automatically extracted as I would like. I'd like to avoid re-parsing all data sources and creating custom add-ons for all of them. Has anybody encountered this kind of integration and found a way to use standard add-ons to parse only the message field? Thank you for your help. Ciao. Giuseppe
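For reference, one search-time sketch, assuming the payload inside "message" is itself JSON: extract its nested fields with spath (the index and sourcetype names here are hypothetical):

```
index=logstash sourcetype=logstash_json
| spath input=message
```

If the inner payload is key=value text rather than JSON, `| extract` or a rex extraction against the message field would be alternatives.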
We use the ansible-role-for-splunk framework found on GitHub: https://github.com/schneewe/ansible-role-for-splunk It supports app deployments through the following task: https://github.com/schneewe/ansible-role-for-splunk/blob/master/roles/splunk/tasks/configure_apps.yml But this seems to require a full Search Head Cluster, and we only have a single search head node. Isn't the single-search-head setup supported by this framework, or am I just missing something?
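For context, a minimal inventory sketch for a standalone search head. The group and host names below are purely illustrative assumptions, not taken from the role's documentation; check the role's README for the group names it actually expects:

```yaml
# inventory.yml (illustrative sketch only)
all:
  children:
    search:
      hosts:
        sh01.example.com:
```

Whether configure_apps.yml special-cases a one-node "cluster" likely depends on the role version, so comparing the task's conditionals against your inventory groups is a reasonable first step.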
Hi, I am using the same sourcetype on the same file. One copy is coming in via a forwarder and the other is uploaded via the GUI. However, the fields are not extracted for the forwarded data. This means I have to use spath to access the fields, which is a pain. Below is a file from a forwarder; we can see the fields are not extracted. Below is the same file but uploaded; in this case, the fields are extracted. This is the sourcetype:

[import_json_2]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIMESTAMP_FIELDS = start_time
TZ = Asia/Beirut
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = 1

Any ideas? Thanks in advance. Rob
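A hedged note on why this often happens: INDEXED_EXTRACTIONS is applied where the file is read, so for data arriving via a universal forwarder the stanza must also exist in props.conf on the forwarder itself, not only on the indexers or search head. A forwarder-side sketch (the app name in the path is hypothetical):

```
# $SPLUNK_HOME/etc/apps/my_inputs_app/local/props.conf on the forwarder (sketch)
[import_json_2]
INDEXED_EXTRACTIONS = json
TIMESTAMP_FIELDS = start_time
TZ = Asia/Beirut
```

A GUI upload is parsed directly on the search head, which is why the same stanza works there but not for the forwarded copy.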
Hi All, In my current dashboard I have several text inputs that colleagues can use to find various information. Sometimes it takes a while for their information to appear. Is there a way to add a loading notification/alert to advise colleagues that Splunk is retrieving the information but it may take some time? The delay usually is only for their first search, and thereafter the searches are pretty much instant. Many thanks, Paula
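One Simple XML sketch, assuming a classic dashboard: set a token while the search runs, unset it when the search finishes, and show an HTML panel only while the token exists (the search query and token name are hypothetical):

```xml
<search id="main_search">
  <query>index=myindex $user_input$</query>
  <progress><set token="loading">true</set></progress>
  <done><unset token="loading"/></done>
</search>
<panel depends="$loading$">
  <html><p>Retrieving your results, this may take a moment...</p></html>
</panel>
```

The `depends` attribute hides the panel whenever the `loading` token is unset, so the notice appears only while the search is in flight.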
Name                  sku    kit
NAC-D-CDSK-DLS-05.90  NAC-D  HJA-JEOE-DNDN-94.4.0

This is my data. I want to replace NAC-D with ANT-P, for multiple values. This is my search query:

| eval sku = if(name=="", substr(kit,1,5), substr(name,1,5))
| eval sku = case(sku=="NAC-D", "ANT-P", sku=="DHV-K", "ABD-U", true(), sku)
Hi, I have created a custom app to implement ACME on search head cluster members, with a script in the bin folder that updates files/certificates in three folders: ./acme, ./certs, ./backup. The contents of these folders need to be different on each server (the deployer and the 3 members). How do I correctly deploy/implement this configuration? Thanking you in advance, Graça
If I have a lookup table that contains the following:

error,priority
Unable to find any company of ID,P2
500 Internal Server Error,P1

and a result query with fields:

500 Internal Server Error: {xxx}
Unable to find any company of ID: xxx

using the below query only brings back direct matches:

<search query>
| lookup _error_message_prority error AS ErrorMessage OUTPUTNEW Priority AS Priority

Is there a way to use wildcards, 'like' or 'contains' when using lookup tables in Splunk Cloud?
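One common approach is a wildcard match on the lookup: define the lookup with `match_type = WILDCARD` on the error field and put `*` in the lookup values. A transforms.conf sketch (the CSV filename is an assumption):

```
# transforms.conf (sketch)
[_error_message_prority]
filename = error_message_priority.csv
match_type = WILDCARD(error)
max_matches = 1
```

The CSV rows would then carry wildcards, e.g. `500 Internal Server Error*,P1`. In Splunk Cloud the same match-type setting is exposed in the lookup definition's advanced options in Splunk Web, so no filesystem access is needed.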
Hello! According to the ITSI documentation (https://docs.splunk.com/Documentation/ITSI/4.17.1/Configure/KVPerms) there is a KV store collection called "maintenance_calendar" that contains maintenance window details. I need to run some searches on the schedules, but I cannot access the data in the KV store due to an error (screenshot not included). Is it possible to achieve what I am looking to do? Thank you and best regards, Andrew
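For reference, KV store collections are typically searched through a lookup definition layered over them. A sketch, assuming such a definition exists (the lookup and field names below are hypothetical) and that your role has been granted read access to the collection:

```
| inputlookup maintenance_calendar_lookup
| table title, start_time, end_time
```

If no lookup definition exists, one can usually be created in Settings > Lookups pointing at the collection, provided the collection's permissions allow it.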
When can customers with existing SOAR instances expect to get migrated from the trial MC instance?
Hi, I want to simply know the bandwidth usage by URL (I use a 10-second span to avoid flooding), then I divide by 10. I wrote this and it seems OK (but I'm not sure). Is it correct?

uri="/myappli/*"
| timechart sum(eval(round(bytes/1024/1024))) AS MB span=10s

Thanks
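A hedged observation: rounding each event's bytes to whole megabytes before summing can drop small requests entirely; summing raw bytes first and converting afterwards is usually safer. A sketch:

```
uri="/myappli/*"
| timechart span=10s sum(bytes) as total_bytes
| eval MB = round(total_bytes/1024/1024, 2)
| eval MB_per_sec = round(MB/10, 3)
```

Dividing by 10 at the end converts each 10-second bucket into an average per-second rate, which matches the intent described above.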
I want to see 100% when the "No results found." message comes up.
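One common sketch for forcing a default row when a search returns nothing is appendpipe: the subpipeline sees the (empty) result set and emits a row only in that case. The field name `percent` is an assumption about the underlying search:

```
... your search ...
| appendpipe [ stats count | where count==0 | eval percent=100 | fields percent ]
```

When the outer search has results, the `where count==0` clause discards the appended row, so normal output is unaffected.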
Hi Splunk community, I have an issue when installing Splunk Enterprise Security on the deployer. My Splunk environment has a 3-member Search Head Cluster, a 2-member Indexer Cluster, and 1 Master Node (the Master Node holds the Deployer and License Master roles), and every Splunk Enterprise component is on version 9.1.0.2. I want to install the Splunk ES 7.2 app on the Search Head Cluster following the Splunk guide (https://docs.splunk.com/Documentation/ES/7.2.0/Install/InstallEnterpriseSecuritySHC). When I install the Splunk ES app on the deployer, an error occurs (screenshot not included). Please help me find a solution to this issue. Thanks for all the contributions!
Hi, I have 2 lookup tables, lookup A and lookup B. Both of the lookups contain the fields Hostname and IP. Here is the scenario:

Lookup A
Hostname   IP
Host A     10.10.10.1
           10.10.10.2
Host B     172.1.1.1

Lookup B
Hostname   IP
Host A     10.10.10.1
Host B     172.1.1.1
           172.1.1.2

Based on the scenario above, I need a result showing where lookup A and lookup B do not match per host. As long as one IP in lookup A matches lookup B, the host is fine, but lookup B should not have multiple IPs for that host; if it does, the host should not match even when one IP matches. For your info, both lookups can have multiple IPs per host. Based on the lookup sample above, Host A should match and Host B should not match under my condition. Please assist with this. Thank you
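A possible sketch, assuming the lookup files are named lookupA.csv and lookupB.csv and hold one IP per row (with the Hostname repeated): tag each row with its source, then require at least one shared IP and exactly one IP in lookup B per host:

```
| inputlookup lookupA.csv | eval src="A"
| append [ | inputlookup lookupB.csv | eval src="B" ]
| stats values(src) as sources by Hostname, IP
| eval shared = if(mvcount(sources)==2, 1, 0)
| eval from_B = if(isnotnull(mvfind(sources, "^B$")), 1, 0)
| stats max(shared) as has_shared_ip, sum(from_B) as b_ip_count by Hostname
| eval matched = if(has_shared_ip==1 AND b_ip_count==1, "match", "no match")
```

Against the sample data this marks Host A as "match" (one shared IP, one IP in B) and Host B as "no match" (a shared IP exists, but B lists two IPs).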
Hi All, We are a Splunk Cloud customer with ES. Is there a way to fetch the ISP and domain info for an IP address directly in the Splunk results? I have looked at this post: https://community.splunk.com/t5/Splunk-Search/Is-there-a-way-to-query-whois-by-ip/m-p/316975 but the DomainTools add-on requires a paid subscription. Alternatively, I know that we can set up a workflow action to perform a whois lookup via a right-click implementation, but that is again a manual task and it ends up redirecting us to the whois website. I am looking for something open source that can fetch the ISP and domain for an IP address easily. Any thoughts or suggestions? Any ES users, how do you accomplish this?
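For reference, the built-in iplocation command enriches results with GeoIP data at no extra cost; it returns location fields (City, Country, Region) rather than ISP or registrant domain, but it can be a starting point (the field name src_ip is an assumption about your data):

```
... your search ...
| iplocation src_ip
| table src_ip, City, Region, Country
```

Covering ISP and domain would still need an external data source, for example a periodically refreshed CSV lookup built from an open ASN/whois dataset.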
Folks, I'm new to the SPL world. Please advise me on the right direction to learn Splunk search. Environment: proxy log search. Situation: some clients sent a massive number of HTTP requests in a small period of time to various destinations (I suspect these clients are infected by malware). How can I find these clients with a Splunk search over proxy or firewall logs? The transaction command will help to find how many sessions were generated by a single IP, but I don't know the next steps.
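A hedged sketch using stats instead of transaction (usually much cheaper): bucket time into one-minute windows and count requests and distinct destinations per client. The field names src_ip and dest are assumptions about your proxy sourcetype, and the thresholds are arbitrary starting points:

```
index=proxy
| bin _time span=1m
| stats count as requests, dc(dest) as distinct_dests by _time, src_ip
| where requests > 500 AND distinct_dests > 50
| sort - requests
```

Clients that surface here, many requests fanning out to many destinations in a single minute, are the candidates worth investigating further.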
Hi, I am new to Observability. I am looking for the steps to integrate Splunk Observability Cloud with an SMTP server for email notifications. I have looked in the documentation but could not find any specific topic, as we have in the Splunk docs for Splunk Enterprise/Cloud. Please help. Rgds, Uday
My question is very simple. This returns nothing:

sourcetype=my_sourcetype

This returns X amount of events (the same amount as index=my_index):

index=my_index AND sourcetype=my_sourcetype

The search is in Verbose Mode. What am I missing?! How come adding another filter returns more events?
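For reference, a likely explanation: a search without an index= term only scans the indexes configured as defaults for your role, so if my_index is not among them, the bare sourcetype search finds nothing. One way to check your role's defaults via the REST endpoint:

```
| rest /services/authorization/roles splunk_server=local
| table title, srchIndexesDefault, srchIndexesAllowed
```

If my_index is missing from srchIndexesDefault for your role, adding it there (or always specifying index= in searches) would explain and fix the discrepancy.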
Hello Splunk lovers! I got stuck while setting up a Kafka connection from Splunk to a Kafka broker, with the error "LZ4 compression not implemented". Maybe someone has already had and solved this problem. How can I solve it? Please help.
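A hedged guess at the usual cause: the consuming client library lacks LZ4 support while the topic's producers compress messages with LZ4. Two common workarounds are adding LZ4 support to the client's environment, or changing the compression on the producer side. A producer-side sketch (this property belongs to the producers writing the topic, not to Splunk):

```
# producer.properties (sketch)
compression.type=gzip
# or "none" to disable compression entirely and avoid LZ4 on this topic
```

Which fix applies depends on which component emits the error, so checking whether it originates from the modular input's client library or from the broker logs is a reasonable first step.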