All Topics

Hello all, I could use some help creating a search. Ultimately, I would like to know: if a user is added to a specific set of security groups, which security groups, if any, were removed from that same user.

Here is a search for security group removals:

index=wineventlog EventCode=4729 EventCodeDescription="A member was removed from a security-enabled global group" (Subject_Account_Name=srv_HiveProvSentryNe OR Subject_Account_Name=srv_HiveProvSentry) source="WinEventLog:Security" sourcetype=WinEventLog
| table member, Group_Name, Subject_Account_Name, _time

Here is a search for security group additions:

index=wineventlog EventCode=4728 EventCodeDescription="A member was added to a security-enabled global group" (Subject_Account_Name=srv_HiveProvSentryNe OR Subject_Account_Name=srv_HiveProvSentry) source="WinEventLog:Security" sourcetype=WinEventLog
| table member, Group_Name, Subject_Account_Name, _time

Additional search info:
EventCode=4728 - added
EventCode=4729 - removed
Group_Name - security group
Subject_Account_Name - prov sentry account
member - user

Security groups I would like to monitor users being added to:
RDSUSers_GRSQCP01
RDSUSers_GROQCP01
RDSUSers_BRSQCP01
RDSUSers_BROQCP01
RDSUSers_VRSQCP01
RDSUSers_VROQCP01

Again, I am looking to monitor whether a user who was added to any of the above six security groups was removed from any other groups within a few hours before or after that event. Let me know if I can provide any additional info, and as always, thank you for the help.
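One possible approach, sketched under the assumption that both event codes share the index and sourcetype shown above: pull additions (4728) and removals (4729) together, then compare per member with stats. The four-hour window (14400 seconds) is an illustrative value.

```spl
index=wineventlog source="WinEventLog:Security" sourcetype=WinEventLog (EventCode=4728 OR EventCode=4729)
  (Subject_Account_Name=srv_HiveProvSentryNe OR Subject_Account_Name=srv_HiveProvSentry)
| eval action=if(EventCode=4728, "added", "removed")
| search action="removed" OR (action="added" Group_Name IN ("RDSUSers_GRSQCP01", "RDSUSers_GROQCP01", "RDSUSers_BRSQCP01", "RDSUSers_BROQCP01", "RDSUSers_VRSQCP01", "RDSUSers_VROQCP01"))
| stats values(eval(if(action="added", Group_Name, null()))) as groups_added
        values(eval(if(action="removed", Group_Name, null()))) as groups_removed
        range(_time) as span_seconds
        by member
| where isnotnull(groups_added) AND isnotnull(groups_removed) AND span_seconds<=14400
```

range(_time) is only a coarse proxy for "within a few hours of each other"; if a member can have many events, a streamstats time_window approach would bound each add/remove pair more precisely.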
Hi, if we upgrade our license to 500 GB, what is the best-practice hardware architecture (CPU + RAM), and what number "N" of search heads and "N" of indexers should we have? How much storage per indexer do we need if, let's say, retention is 30 days across "N" installed indexers? Or, at the least, can you share where there is a good PDF for me to read with those answers? Thank you.
My data is coming from O365 as JSON. I am using spath to get the required fields; after that, I want to compare the data with a static list containing roles to be monitored, but unfortunately I am getting the error below:

Error in 'table' command: Invalid argument: 'role="Authentication Administrator"'

It's not working. Please find attached the relevant screenshot.
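For context, that error message usually appears when a filter expression is handed to the table command, which accepts only field names. A minimal sketch of the usual pattern — the role value is taken from the error above, and the lookup name monitored_roles.csv is a hypothetical placeholder for the static list:

```spl
... | spath
| lookup monitored_roles.csv role OUTPUT role as monitored
| where isnotnull(monitored)
| table _time, role
```

Filtering belongs in search, where, or a lookup; table then only selects which fields to display.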
Hi there!

I have a dropdown "office" in dashboard 1 as a multiselect (full office, half office); based on the selection, it should display the results in dashboard 1.

In dashboard 1, I have a pie chart. If I click the pie chart, it needs to take me to dashboard 2, which contains the same dropdown "office" as a multiselect (full office, half office, non-compliant office).

If I click the pie chart in dashboard 1 when the office value is "full office, half office", it should show the same selection in dashboard 2, and the panels in dashboard 2 should use that value. I have configured the drilldown link already; the problem is that the multiselect's prefix ", postfix ", and delimiter , are passed as-is to the dashboard 2 dropdown, so I don't get results in the dashboard 2 panels.

I need a solution for this. Thanks, Manoj Kumar S
Hello, I would like to calculate a weighted average of an average call time. The logs I have available are of the type shown in the screenshot, and I want to obtain the average time calculated this way. The formula applied is as follows:

temps_moyen = [(nb_appel_1 * temps_moyen_1) + (nb_appel_2 * temps_moyen_2) + ...] / sum of nb_appel

Here is what I have done so far:

index=rcd statut=OK partenaire=000000000P
| eval date_appel=strftime(_time,"%b %y")
| dedup nom_ws date_appel partenaire temps_rep_max temps_rep_min temps_rep_moyen nb_appel statut tranche_heure heure_appel_max
| eval nb_appel_OK=if(isnotnull(nb_appel) AND statut="OK", nb_appel, null())
| eval nb_appel_KO=if(isnotnull(nb_appel) AND statut="KO", nb_appel, null())
| eval temps_rep_min_OK=if(isnotnull(temps_rep_min) AND statut="OK", temps_rep_min, null())
| eval temps_rep_min_KO=if(isnotnull(temps_rep_min) AND statut="KO", temps_rep_min, null())
| eval temps_rep_max_OK=if(isnotnull(temps_rep_max) AND statut="OK", temps_rep_max, null())
| eval temps_rep_max_KO=if(isnotnull(temps_rep_max) AND statut="KO", temps_rep_max, null())
| eval temps_rep_moyen_OK=if(isnotnull(temps_rep_moyen) AND statut="OK", temps_rep_moyen, null())
| eval temps_rep_moyen_KO=if(isnotnull(temps_rep_moyen) AND statut="KO", temps_rep_moyen, null())
| stats sum(nb_appel_OK) as nb_appel_OK, sum(nb_appel_KO) as nb_appel_KO, min(temps_rep_min_OK) as temps_rep_min_OK, min(temps_rep_min_KO) as temps_rep_min_KO, max(temps_rep_max_OK) as temps_rep_max_OK, max(temps_rep_max_KO) as temps_rep_max_KO, values(temps_rep_moyen_OK) as temps_rep_moyen_OK, values(temps_rep_moyen_KO) as temps_rep_moyen_KO, values(nom_ws) as nom_ws, values(date_appel) as date_appel
| eval temps_rep_moyen_KO_calcul=sum(temps_rep_moyen_KO*nb_appel_KO)/(nb_appel_KO)
| eval temps_rep_moyen_OK_calcul=sum(temps_rep_moyen_OK*nb_appel_OK)/(nb_appel_OK)
| fields - tranche_heure_bis, tranche_heure_partenaire
| sort 0 tranche_heure
| table nom_ws partenaire date_appel nb_appel_OK nb_appel_KO temps_rep_min_OK temps_rep_min_KO temps_rep_max_OK temps_rep_max_KO temps_rep_moyen_OK temps_rep_moyen_KO

I cannot get the final weighted average displayed: temps_moyen = [(nb_appel_1 * temps_moyen_1) + (nb_appel_2 * temps_moyen_2) + ...] / sum of nb_appel. I really need help, please. Thank you so much.
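The formula above is an aggregation (a sum over events), so it belongs inside stats rather than in an eval, where sum() is not an aggregate function. A minimal sketch of the weighted mean, assuming the field names from the search above:

```spl
index=rcd statut=OK partenaire=000000000P
| eval date_appel=strftime(_time, "%b %y")
| eval produit=temps_rep_moyen*nb_appel
| stats sum(produit) as somme_ponderee, sum(nb_appel) as nb_appel_total by nom_ws, date_appel
| eval temps_rep_moyen_calcul=round(somme_ponderee/nb_appel_total, 2)
```

The same pattern can be repeated per statut (OK/KO) by adding statut to the by clause instead of splitting into _OK/_KO fields.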
Hello community, I have come across an issue where an identical token was generated for the SOAR user "REST" that I am using for the SIEM-SOAR integration, and the same token was in the Splunk App for SOAR. When I ran the "test connectivity" command on the SOAR Server Configuration, it responded with "Authentication Failed: Invalid token". I just regenerated the token and everything works like a charm. Have you ever encountered such an issue?
Hi, I would like to export a table to CSV in Dashboard Studio. Unfortunately, when I click on export, only a PNG is exported. Any hint? Thank you. Best regards, Marta
Hi Splunkers,

I have a multiselect whose values need to be passed to a macro. Can you please help with that?

The need is to pass the multiselect values to the token $macros2$, where each multiselect value is itself a macro. Multiselect values:
1. value 1
2. value 2
3. value 3
4. All

Search: `macros1(`$macros2$`, now(), -15d@d, *, virus, *, *, *)`

Thanks in advance! Manoj Kumar S
Hi, I have the table data below, timecharted with a 1-hour span. I want to remove the rows highlighted in red, as they come in at a different time compared to the other data. Can I use the outlier command to perform this operation, and how can I achieve this requirement? Thank you in advance.

_time              B  C  D  E  F
2023-10-06 22:00
2023-10-07 22:00
2023-10-08 22:00
2023-10-09 09:00
2023-10-09 22:00
2023-10-10 09:00
2023-10-10 22:00
2023-10-11 22:00
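One possible approach, rather than outlier: since the unwanted rows differ only in their hour of day, the hour can be filtered after the timechart. A sketch assuming the rows to keep all land at 22:00 (the base search and split-by field are placeholders):

```spl
... | timechart span=1h count by host
| eval hour=strftime(_time, "%H")
| where hour="22"
| fields - hour
```

Alternatively, a span of 1d aligned to the daily run time would avoid producing the off-hour rows in the first place.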
Hi team, I am trying to create a topic manually using Confluent Control Center (localhost:9021), and then, via Connect --> connect-default --> Connector --> Upload connector config file, I am uploading the Splunk sink properties, which already contain splunk.hec.token. But I am still getting the error ""splun.hec.token" is invalid" in the Confluent UI (2nd screenshot) in the browser. I would appreciate it if anybody could help here. Please note we are trying this on Ubuntu, and Splunk, Confluent, and Kafka Connect are all on the same network on the same server.

Splunk sink properties:

name=TestConnector
topics=mytopic
tasks.max=1
connector.class=com.splunk.kafka.connect.SplunkSinkConnector
splunk.hec.token=453a412d-029f-4fcf-a896-8c388241add0
splunk.indexes=Attest
splunk.hec.uri=https://localhost:8889
splunk.hec.raw=true
splunk.hec.ack.enabled=true
splunk.hec.ssl.validate.cert=false
splunk.hec.ack.poll.interval=20
splunk.hec.ack.poll.threads=2
splunk.hec.event.timeout=300
splunk.hec.ssl.validate.certs=false
Is Splunk Universal Forwarder compatible with Amazon Linux?  
How can I remove the "Open in Search" (search magnifying glass) icon/option from a panel in a Dashboard Studio dashboard? I know how it's done in the Classic dashboard, but cannot work out how to do it in Dashboard Studio. Thanks
A recent change to logs has broken my dashboard panels and reporting. I'm struggling to find the best way to modify my search criteria to pick up data from both before and after the change. It's a very simple change, as single quotation marks were added around the field value, but it's giving me a big headache.

index=prd sourcetype=core Step=* Time=*
| timechart avg(Time) by Step span=1d

Field in the event log changed:
FROM: Step=CONVERSION_APPLICATION
TO: Step='CONVERSION_APPLICATION' (with single quotation marks)
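A hedged workaround: strip the quotes with eval before charting, so the old unquoted and new single-quoted values of Step aggregate into one series:

```spl
index=prd sourcetype=core Step=* Time=*
| eval Step=trim(Step, "'")
| timechart avg(Time) by Step span=1d
```

trim() only removes the listed characters from the ends of the value, so unquoted historical values pass through unchanged.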
Hello: I recently started playing with the Risk framework, RBA, etc. Most of my Risk Analysis dashboard is working within Enterprise Security, except for three (3) sections:

Risk Modifiers By Annotations
Risk Score By Annotations
Risk Modifiers By Threat Object

For the annotations part, we do manually tag MITRE ATT&CK tactics within our content, so I'm not sure why these panels do not show anything. Also, does anyone know which saved searches run in the background to populate these panels? I'd like to double-check to make sure I have them enabled. Thanks!
I'm having trouble getting a duration between two timestamps from some extracted fields. My search looks like this:

MySearchCriteria index=MyIndex source=MySource
| stats list(ExtractedFieldStartTime) as MyStartTime, list(ExtractedFieldEndTime) as MyEndTime by AnotherField
| eval MyStartUnix=strptime(MyStartTime, "%Y-%m-%dT%H:%M:%S")
| eval MyEndUnix=strptime(MyEndTime, "%Y-%m-%dT%H:%M:%S")
| eval diff=MyEndUnix-MyStartUnix
| table MyStartTime MyEndTime MyStartUnix MyEndUnix diff

And my table is returned with diff empty:

MyStartTime          MyEndTime            MyStartUnix        MyEndUnix          diff
2023-10-10T14:48:39  2023-10-10T14:15:15  1696963719.000000  1696961715.000000
2023-10-10T14:57:50  2023-10-10T13:56:53  1696964270.000000  1696960613.000000
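A likely cause, offered as a sketch rather than a confirmed fix: list() produces multivalue fields, and eval functions over a multivalue field yield null, so diff is never computed. Converting to epoch before the stats keeps the values single-valued (note the sample rows also show end times earlier than start times, which would produce negative durations):

```spl
MySearchCriteria index=MyIndex source=MySource
| eval MyStartUnix=strptime(ExtractedFieldStartTime, "%Y-%m-%dT%H:%M:%S")
| eval MyEndUnix=strptime(ExtractedFieldEndTime, "%Y-%m-%dT%H:%M:%S")
| stats min(MyStartUnix) as MyStartUnix, max(MyEndUnix) as MyEndUnix by AnotherField
| eval diff=MyEndUnix-MyStartUnix
```

min()/max() assume one start and one end per AnotherField; if there are several pairs, a different grouping key is needed.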
I need help with a regex to extract keys and values from raw data. The regex below works with automatic XML KV extraction, and it works on regex101 but not in Splunk with rex. Any suggestions?

<(?<field_header>[^>]+)>(?<field_value>[^<]+)<\/\1>

https://regex101.com/r/IBsMhK/1

E.g., for events with <field_title><field_header1>field_value1</field_header1><field_header2>field_value2</field_header2></field_title>, the fields should appear as below:

field_title = <field_header1>field_value1</field_header1><field_header2>field_value2</field_header2>
field_header1 = field_value1
field_header2 = field_value2

Sample events:

1997-10-10 15:35:13.046, CREATE_DATE="1997-10-10 13:36:22.742479", LAST_UPDATE_DATE="1997-10-10 13:36:22.74", ACTION="externalFactor", STATUS="info", DATA_STRING="<?xml version="1.0" encoding="UTF-8"?> <externalFactor><current>parker</current><keywordp><encrypted>true</encrypted><keywordp>******</keywordp></keywordp><boriskhan>boriskhan1-CMX_PRTY</boriskhan></externalFactor>"

1997-10-10 15:35:13.046, CREATE_DATE="1997-10-10 13:03:58.388887", LAST_UPDATE_DATE="1997-10-10 13:03:58.388", ACTION="externalFactor.RESPONSE", STATUS="info", DATA_STRING="<?xml version="1.0" encoding="UTF-8"?> <externalFactorReturn><roleName>ROLE.CustomerManager</roleName><roleName>ROLE.DataSteward</roleName><pepres>false</pepres><externalFactor>false</externalFactor><parkeristrator>true</parkeristrator><current>parker</current></externalFactorReturn>"

(The remaining sample events repeat the same ACTION="externalFactor" and ACTION="externalFactor.RESPONSE" payloads with different CREATE_DATE values.)

@priit @PriA @yonmost @jameshgibson @bnikhil0584
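For reference, one hedged alternative: rather than relying on the backreference, rex with max_match=0 can capture the repeated tag/value pairs into multivalue fields, which mvzip can then pair up. Field names follow the post; the tag-name pattern assumes simple alphanumeric tag names.

```spl
... | rex field=DATA_STRING max_match=0 "<(?<field_header>[a-zA-Z0-9_]+)>(?<field_value>[^<]+)<"
| eval pairs=mvzip(field_header, field_value, "=")
```

This sketch does not validate that each opening tag matches its closing tag the way \1 does, so nested tags such as <keywordp><encrypted> will produce extra entries.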
[monitor:///var/log/suricata/eve.json]
disabled=true
sourcetype = suricata
index = suricata

Currently we are not seeing any eve.json data coming from the Suricata box to the Splunk server. We do get other logs, like syslog, but no eve.json data. I tried putting the TA in the apps folder on the server; that didn't work. I added index = suricata on the server, and it still doesn't find it. Any help would be appreciated. Instructions on deploying the app would be nice.
I need to search a field called DNS_Matched, which is a multivalue field, for events that have one or more values ending with -admin, -vip, or -mgt, or values that do not match any of those three. How can I do that?

Example DNS_Matched values:
host1
host1-vip
host1-mgt
host2
host2-admin
host2-mgmt
host2-vip
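One possible sketch using mvfilter() with match(), assuming the suffixes are literal and the base search is a placeholder:

```spl
... | eval matched=mvfilter(match(DNS_Matched, "(-admin|-vip|-mgt)$"))
| eval unmatched=mvfilter(NOT match(DNS_Matched, "(-admin|-vip|-mgt)$"))
| where isnotnull(matched) OR isnotnull(unmatched)
```

matched holds the values ending in one of the three suffixes and unmatched holds the rest; note the example list also contains host2-mgmt, which ends in -mgmt rather than -mgt and would therefore land in unmatched.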
I am aware of this site: https://docs.splunk.com/Documentation/Splunk/7.2.10/Forwarding/Compatibilitybetweenforwardersandindexers

I have several simple Splunk implementations (all functions run on one server). My indexers are a mixture of 6.5 and 6.6. I plan on upgrading to 7.2.10, with the eventual goal of getting to the latest version.

First, I'd like to understand which forwarders can communicate with which indexers. The link above relates to 7.0.0 and later, and I'm at 6.5/6.6, as stated earlier.

Secondly, I know I need to upgrade Splunk through various incremental versions before I get to 9.x. What is the recommended path for upgrading to 9.x? Since I'm on 6.5 or 6.6, I believe my next step is 7.2.10 (is that right?). But what is the path after that?

Thanks for the help!
Can someone help me with the Splunk code that would be necessary to search for the Idemia machines? Thank you, Anthony