All Topics

Using a UF to send a JSON file; below is the props.conf:

[test_json]
pulldown_type = true
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = json
KV_MODE = none
SHOULD_LINEMERGE = true
AUTO_KV_JSON = false
category = Structured

The inputs.conf also contains crcSalt = <SOURCE>.

The result keeps showing as below:

AB-17[3]
AB-17[3]
XY-17[2]
XY-17[2]
SI-17[1]
SI-17[1]

I can't figure out the problem.
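A few things commonly cause duplicated events in this kind of setup (a sketch, not a confirmed diagnosis): SHOULD_LINEMERGE = true combined with INDEXED_EXTRACTIONS = json, the same structured-data props deployed on both the UF and the indexer (each layer parsing the file once), or crcSalt = <SOURCE> forcing a full re-read whenever the file is rewritten. A minimal props.conf for the UF that monitors the file might look like:

```
# Sketch only -- stanza name "test_json" is taken from the question above
[test_json]
INDEXED_EXTRACTIONS = json
KV_MODE = none
AUTO_KV_JSON = false
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
category = Structured
```

With INDEXED_EXTRACTIONS the UF does the structured parsing itself, so the same stanza should not also be applied on the indexer tier for this sourcetype.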
Hi there!

I would like to include/exclude weekends in the search, so I created a dropdown for that, but I'm getting an error in the searches. My time format is 2023-10-15T13:11:20.000+05:30.

My dropdown is:

<input type="radio" token="weekends" searchWhenChanged="true">
  <label>Weekends</label>
  <choice value="NOT (day_of_week=&quot;saturday&quot; OR day_of_week=&quot;sunday&quot;)">Exclude Weekends</choice>
  <choice value="day_of_week=&quot;*&quot;">Include Weekends</choice>
  <default>NOT (day_of_week="saturday" OR day_of_week="sunday")</default>
  <initialValue>NOT (day_of_week="saturday" OR day_of_week="sunday")</initialValue>
</input>

My search is:

`compliance("`console`", now(), -15d@d, mcafee,*, virus_, *, *, *)`
| eval day_of_week = lower(strftime(_time,"%A"))
| where NOT (day_of_week="saturday" OR day_of_week="sunday")
| chart count by virus_global
| sort virus_global

Thanks!
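One thing worth checking (a sketch, not a confirmed fix): the search hardcodes the weekend filter instead of consuming the $weekends$ token the dropdown sets. Since the choice values are already search expressions, the token can be dropped straight into a search command after the eval that creates day_of_week:

```
`compliance("`console`", now(), -15d@d, mcafee,*, virus_, *, *, *)`
| eval day_of_week = lower(strftime(_time,"%A"))
| search $weekends$
| chart count by virus_global
| sort virus_global
```

With this wiring, "Include Weekends" substitutes day_of_week="*" (matches everything) and "Exclude Weekends" substitutes the NOT clause.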
I have this multivalue field where I am trying to rex out one particular field value, like "value":"ESC1000", but instead I am getting multiple values. I tried this, but it shows every field's value:

| rex max_match=0 "value\":\"(?<ESC>[^\"]+)"

"productChanges":[
  {
    "Products":[
      {
        "productSpecContainmentID":"1",
        "changedCharacteristics":[
          { "characteristicID":"2", "value":"SERVICE" },
          { "characteristicID":"3", "value":"99" },
          { "characteristicID":"4", "value":"monthly" },
          { "characteristicID":"5", "value":"ESC1000" },
          { "characteristicID":"6", "value":"Discount" },
          { "characteristicID":"7", "value":"Escalation" },
          { "characteristicID":"8", "value":"AMOUNT" },
          { "characteristicID":"9", "value":"9" },
          { "characteristicID":"10", "value":"Y" },
          { "characteristicID":"11", "value":"N" }
        ],
        "temporaryID":"xxxaaaacccc"
      }
    ]
  }
]

Is there a way to get only the required field's value, like above?
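The rex above captures every "value" key because nothing in the pattern ties it to one characteristic. Anchoring the capture on the characteristicID that precedes the value is one way around that. A plain-Python sketch of the idea (the ID "5" is an assumption; substitute whichever characteristic you need):

```python
import re

# Abbreviated sample from the event above
event = '''{ "characteristicID":"4", "value":"monthly" },
{ "characteristicID":"5", "value":"ESC1000" },
{ "characteristicID":"6", "value":"Discount" }'''

# Match "value" only when it directly follows characteristicID 5
pattern = r'"characteristicID":"5",\s*"value":"([^"]+)"'
match = re.search(pattern, event)
print(match.group(1))  # ESC1000
```

The equivalent Splunk rex would be along the lines of | rex "\"characteristicID\":\"5\",\s*\"value\":\"(?<ESC>[^\"]+)\"" (a sketch; the escaping may need adjusting to your raw event). Parsing the JSON with spath on changedCharacteristics{} is another route that avoids regex entirely.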
Hi. I want to create a search that checks the last user login date in AWS. I can see the logins in AWS IAM, and there are a bunch of users whose last login was more than 300 days ago, but it would be nice to see them in SPL. I would appreciate it if anyone could share their search or tell me how to build one. Thank you.
I am trying to use my Windows event data to update user IDs on Panorama; however, running the below query in my ES environment returns this error:

External search command 'panuserupdate' returned error code 2. Script output = "ERROR Unable to get apikey from firewall: local variable 'username' referenced before assignment"

The query:

index=wineventlog host=xxxxxx
| mvexpand Security_ID
| mvexpand Source_Network_Address
| dedup Security_ID Source_Network_Address
| search Security_ID!="NULL SID"
| rename Security_ID as user
| rename Source_Network_Address as src_ip
| panuserupdate panorama=x.x.x.x serial=000000000000
| fields user src_ip

Brief overview of my data ingestion: Panorama syslog is ingested into Splunk Cloud through a heavy forwarder. The Palo Alto Add-on for Splunk is installed on both the HF and Splunk Cloud, but no data is showing in the app; every value is 0. I also have a user account in Panorama with API permissions.
Hi guys, I want to detect a service ticket (TGS) request (Windows event code 4769) that is not preceded by one of the following corresponding events:

1. User ticket (TGT) request, Windows event code 4768.
2. Ticket renewal request, Windows event code 4770.
3. Logon event, Windows event code 4624.

The following is the SPL I wrote (with the startswith/endswith options spelled out), but I found that there is a problem. Could you help me modify it?

index="xx"
| transaction user maxspan=24h maxpause=10h startswith=(Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624) endswith=(Eventcode=4769) keepevicted=true
| search Eventcode=4769 NOT (Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624)
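As a sketch of one alternative that avoids transaction entirely (the field name Eventcode and a 24h search window are assumptions; adjust to your data), you could flag users who produced a 4769 but none of the preceding authentication events within the window:

```
index="xx" (Eventcode=4769 OR Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624)
| eval is_tgs=if(Eventcode=4769, 1, 0)
| eval is_auth=if(Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624, 1, 0)
| stats sum(is_tgs) as tgs_count, sum(is_auth) as auth_count by user
| where tgs_count > 0 AND auth_count = 0
```

Note this ignores strict ordering inside the window (it treats "no auth event at all" as suspicious), which is often a good first approximation and much cheaper than transaction.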
Hi, I need an SPL search that goes from src_ip to dest_ip; I would like to know the dest_url, the matching logs, and the outbound traffic size.
In some cases, I encounter problems parsing data with the CIM data model on Windows event log data.

For example, when searching for deleted and created user accounts using the data model:

| from datamodel:"Change"."Account_Management"
| search (action="created" OR action="deleted")

The "user" field is calculated correctly for "created" (4720) events. The field calculation does not work correctly for 4726 events, where the user and source user fields are returned as unknown (even though they are present in the raw log data). I am using the Splunk TA for Windows to ingest the data.

What may be the cause of this behavior?
Hi Team,

My requirement is that WriteRequest should be one event and ChangeItem another event. Please help me with how to break the events:

12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150
12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2
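Splunk's LINE_BREAKER regex cannot look back at the previous event's keyword, so merging consecutive WriteRequest lines into one event and consecutive ChangeItem lines into another is easiest to prototype outside props.conf first. A plain-Python sketch of the intended grouping (not Splunk code):

```python
from itertools import groupby

# The six sample lines from the question
log = """12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150
12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2"""

# Group consecutive lines by their keyword (the third whitespace-separated
# field: WriteRequest or ChangeItem) -- one event per run of equal keywords
events = ["\n".join(group)
          for _, group in groupby(log.splitlines(), key=lambda l: l.split()[2])]
```

If instead each timestamped line should be its own event, a props.conf sketch like LINE_BREAKER = ([\r\n]+)(?=\d{1,2}:\d{2}:\d{2}\s[AP]M) with SHOULD_LINEMERGE = false would break on every timestamp; for the grouped version, a BREAK_ONLY_BEFORE pattern anchored on the keyword transition would need testing against your real feed.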
Dear Team,

We are planning to upgrade our existing underlying OS/VM infrastructure. As part of this process, we need to ensure backup and restoration of our Splunk environment in case any issues arise. Below are the details of our environment:

- Search Head Cluster (SHC)
- A standalone Splunk Security SH
- Indexer cluster
- All other management servers (DS/CM/deployer/LM)
- Heavy Forwarders/Universal Forwarders (UFs)

In addition to backing up $SPLUNK_HOME/etc and $SPLUNK_HOME/var, as well as the kvstore, are there any other components or data that we need to back up to ensure a successful restoration process?
While doing a Splunk search with a query and retrieving logs in an automated manner, the job extracts only a maximum of 200,000 (2 lakh) logs. How can I resolve this issue?
index=abcd
| stats count(eval(searchmatch(''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'))) as ''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X' OR count(eval(searchmatch('value2'))) as 'value2'

I'm getting this error:

Error in 'stats' command: The argument '''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'' is invalid.

This works fine with many other URLs and IPs. Is there any special character that is not allowed with stats?
Hi, I have a modular input that is connected to CIM through eventtypes and tags as follows:

default/eventtypes.conf

[my_event_type]
search = sourcetype=my_default_source_type

default/tags.conf

[eventtype=my_event_type]
alert = enabled

This generally works, up until a user decides to reconfigure the default sourcetype and index. When the sourcetype is altered, the eventtype stanza breaks. How do I go about this? What is the best practice to allow a user to reconfigure the sourcetype while ensuring the CIM integration still works? Should I rely on other fields that I create? Can this hurt the efficiency of the query?

Thanks
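One common pattern (a sketch; the macro name is an assumption) is to move the sourcetype constraint into a search macro that users can override in local/, and have the eventtype reference the macro, so the CIM tag wiring never has to change:

```
# default/macros.conf
[my_app_base_search]
definition = sourcetype=my_default_source_type

# default/eventtypes.conf
[my_event_type]
search = `my_app_base_search`

# default/tags.conf
[eventtype=my_event_type]
alert = enabled
```

A user who changes the sourcetype then edits only the macro definition in local/macros.conf, and the eventtype/tag chain to CIM stays intact. Keep the macro constraint simple (index/sourcetype terms) so the eventtype remains efficient.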
How do I import the BeautifulSoup library in Splunk's Python so that my Python script works?
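Splunk's bundled Python does not ship beautifulsoup4, so the usual approach is to bundle the library inside your app and put it on sys.path before importing. A sketch (the app name "my_app" is hypothetical; pip install --target "$SPLUNK_HOME/etc/apps/my_app/bin/lib" beautifulsoup4 is one way to create the bundle):

```python
import os
import sys

# Sketch: make a bundled copy of beautifulsoup4 importable from a Splunk app.
# Assumptions: the app is named "my_app" and the "bs4" package directory was
# copied into my_app/bin/lib.
SPLUNK_HOME = os.environ.get("SPLUNK_HOME", "/opt/splunk")
LIB_DIR = os.path.join(SPLUNK_HOME, "etc", "apps", "my_app", "bin", "lib")

# Put the bundled libraries ahead of Splunk's own Python path
if LIB_DIR not in sys.path:
    sys.path.insert(0, LIB_DIR)

try:
    from bs4 import BeautifulSoup  # now resolved from bin/lib first
except ImportError:
    BeautifulSoup = None  # bs4 not bundled yet -- copy it into LIB_DIR
```

Bundling inside the app (rather than installing into Splunk's Python) keeps the dependency deployable to search heads and forwarders with the app itself.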
Hi All,

We tried to use the SentinelOne SOAR app to implement a playbook that blocks a hash on SentinelOne. SentinelOne SOAR App: https://splunkbase.splunk.com/app/6056

We found that it applied the hash to only some groups on SentinelOne. Has anyone encountered this issue before? Please advise.
Hello, I want to detect workstations authenticated to Active Directory that are not compliant with our naming convention (the hostname should start with the country code followed by 6 digits; example for a host from Italy: IT000121). I already have a lookup file (| inputlookup code_countries.csv | table alpha-2), but I don't know how to compare it with the 'Workstation' field in my Active Directory index to check that it matches the naming convention described above.

Regards,
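The check itself is a two-part test: the name must match <two letters><six digits>, and the two letters must be a known alpha-2 code. A plain-Python sketch of the logic (the three country codes are placeholders; your full set would come from code_countries.csv):

```python
import re

# Placeholder subset of the alpha-2 codes from the lookup file
country_codes = {"IT", "FR", "DE"}

# Two uppercase letters followed by exactly six digits
pattern = re.compile(r"^([A-Z]{2})(\d{6})$")

def is_compliant(workstation):
    m = pattern.match(workstation)
    return bool(m) and m.group(1) in country_codes

print(is_compliant("IT000121"))  # True
print(is_compliant("XX000121"))  # False: XX is not a known country code
print(is_compliant("IT00012"))   # False: only 5 digits
```

In SPL, a sketch of the same idea: extract the first two characters with substr or rex, look them up against code_countries.csv, and keep events where the lookup returns nothing or the whole name fails match(Workstation, "^[A-Z]{2}\d{6}$").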
Well, I am seeing the event data, but the sourcetype has been missing for the last 24 hours. What could be the reason, and how do I check?
In the process of integrating Splunk with LastPass, we are getting an error like "Your SIEM refused to connect". Please advise.
Hi,

I have a use case where I need to find direct links between different events of the same index and sourcetype. The result should show me three different bars:

bar 1: count of the existing links (incl. matching filter criteria)
bar 2: count of the existing links where the filter criteria don't match
bar 3: count of the events where there is no existing link at all

I got as far as using a left join so as not to lose the non-matching events, but now I don't know how to differentiate them into a bar diagram, or with an if condition to count them. It needs to be counted weekly. Can you help me, please? This is my current query state:

index=A
| rename Name as TargetName
| join type=left max=0 TargetName
    [ search index=A
    | fields TargetName ID Status]
| join type=left SourceID
    [ search index=A
    | fields SourceID, type]
| join type=left TargetID
    [ search index=A
    | fields TargetID]
| bin span=1w@w0 _time
| eval state=if(match(status,"Done") OR match(status,"Pending"), "Link + State is there", if(NOT match(status,"Done") OR NOT match(status,"Pending"), "State is missing", "No Link"))
| dedup ID _time sortby -state
| timechart span=1w@w0 count by state

Somehow I cannot make it work to get all the "non-matching", aka "No Link", events. Is the "if" the right way to get what I need? Do I need to add another "eval" within each join? And if yes, how do I do that?

Thank you for any help! This should be my result (see screenshot).
Hello All, I have a requirement for dropdowns. I have the following lookup file, which contains application, environment, and index details. I need to get the environment details related to each application when I choose the app from the application dropdown; similarly, the index dropdown must only show the index details based on the values chosen in the application and environment dropdowns. I could get the desired results using the lookup file, but how can this be achieved using an eval condition in the Splunk dashboard rather than the lookup file? I have the values of the fields in the Splunk results.

application  environment  index
app_a        DEV          aws-app_a_npd
app_a        PPR          aws-app_a_ppr
app_a        TEST         aws-app_a_test
app_a        SUP          aws-app_a_sup
app_a        PROD         aws-app_a_prod
app_b        NPD          aws-app_b_npd
app_b        SUP          aws-app_b_sup
app_b        PROD         aws-app_b_prod
app_c        NPD          aws-app_c_npd
app_c        SUP          aws-app_c_sup
app_c        PROD         aws-app_c_prod
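If the mappings are known up front, one way (a sketch; the token names $application$ and $environment$ are assumptions) is to populate the cascading dropdown from a makeresults/eval search instead of a lookup. For the environment dropdown:

```xml
<input type="dropdown" token="environment">
  <label>Environment</label>
  <search>
    <query>
      | makeresults
      | eval environment=case($application|s$=="app_a", "DEV,PPR,TEST,SUP,PROD",
                              $application|s$=="app_b", "NPD,SUP,PROD",
                              $application|s$=="app_c", "NPD,SUP,PROD")
      | eval environment=split(environment, ",")
      | mvexpand environment
    </query>
  </search>
  <fieldForValue>environment</fieldForValue>
  <fieldForLabel>environment</fieldForLabel>
</input>
```

The index dropdown can follow the same pattern with a case() keyed on both $application|s$ and $environment|s$. The trade-off versus the lookup is that every new application/environment pair means editing the dashboard XML rather than a CSV row.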