All Topics

After migrating to Splunk 9.1.1, all of the controls under splunk/search_mrsparkle/exposed/js/views/shared/controls/ are no longer there. A search of the filesystem found them under the splunk/quarantined_files/share/splunk/search_mrsparkle/exposed/js/views/shared/controls folder. Was there a reason all of the controls were moved? I looked through the release docs but didn't find anything on this topic. Is there a reason all of these are quarantined, or can I move them all back?
Hi, is it possible to fetch the account access key that appears under the license page using an API command, instead of getting it from the controller? Regards, Mohammed Saad
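A minimal sketch of querying the licenser REST endpoint with curl, assuming the management port is 8089 and that the value you need is surfaced by that endpoint — both are assumptions to verify against your deployment, and the credentials are placeholders:

# List license details via the REST API; inspect the entry payload for the key you need
curl -k -u admin:changeme "https://localhost:8089/services/licenser/licenses?output_mode=json"
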
I'm trying to create a visual dashboard (specifically a column graph or bar chart) using:

index=guardium ruleDesc="OS Command Injection" | stats count by dbUser, DBName, serviceName, sql

This is the graph I get. I would like to group these fields into categories on the chart, where one bar would show the count of 1-5, the next 6-10, and so on. Then I could drill down into a specific bar within a count group to view the fields for that bar in a table format. How would I go about doing this? I am new to Splunk and have been stuck finding the best way to represent this data. I was given this search statement and was told to make a visual dashboard of it.
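A minimal SPL sketch of one way to bin the counts, using the field names from the search above; the bucket boundaries and labels are illustrative:

index=guardium ruleDesc="OS Command Injection"
| stats count by dbUser, DBName, serviceName, sql
| eval count_group=case(count<=5, "01-05", count<=10, "06-10", count<=20, "11-20", true(), "21+")
| stats count as combinations by count_group
| sort count_group

The zero-padded labels keep the buckets in order when sorted as strings. For the drilldown, a second table panel could re-run the first two lines and filter on a token set from the clicked bar (e.g. $click.value$ mapped back to the bucket's count range).
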
On this page it states that we have to use an Amazon S3 or S3-API-compliant remote object storage location, or Azure Blob storage. Does anyone know if this will work with Google Cloud buckets as well?
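A minimal indexes.conf sketch for pointing a SmartStore volume at a GCS bucket through Google's S3-compatible XML API; the bucket name and HMAC credentials are placeholders, and whether this path is supported in your Splunk version is something to verify:

[volume:remote_store]
storageType = remote
path = s3://my-gcs-bucket
remote.s3.endpoint = https://storage.googleapis.com
remote.s3.access_key = <GCS HMAC access key>
remote.s3.secret_key = <GCS HMAC secret key>
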
Using a UF to send a JSON file; below is the props.conf:

[test_json]
pulldown_type = true
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = json
KV_MODE = none
SHOULD_LINEMERGE = true
AUTO_KV_JSON = false
category = Structured

The inputs.conf also contains crcSalt = <SOURCE>.

The result keeps showing as below:

AB-17[3]
AB-17[3]
XY-17[2]
XY-17[2]
SI-17[1]
SI-17[1]

I can't figure out the problem.
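For comparison, a minimal sketch of a more conventional stanza for indexed JSON extraction, assuming one JSON object per line; INDEXED_EXTRACTIONS normally pairs with SHOULD_LINEMERGE = false, and for structured data this stanza has to live on the UF itself:

[test_json]
INDEXED_EXTRACTIONS = json
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
KV_MODE = none
AUTO_KV_JSON = false
category = Structured
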
Hi there!

I would like to include/exclude weekends in the search, so I created a dropdown for that, but I'm getting an error in the searches. My time format is 2023-10-15T13:11:20.000+05:30.

My dropdown is:

<input type="radio" token="weekends" searchWhenChanged="true">
  <label>Weekends</label>
  <choice value="NOT (day_of_week=&quot;saturday&quot; OR day_of_week=&quot;sunday&quot;)">Exclude Weekends</choice>
  <choice value="day_of_week=&quot;*&quot;">Include Weekends</choice>
  <default>NOT (day_of_week="saturday" OR day_of_week="sunday")</default>
  <initialValue>NOT (day_of_week="saturday" OR day_of_week="sunday")</initialValue>
</input>

My search is:

`compliance("`console`", now(), -15d@d, mcafee,*, virus_, *, *, *)`
| eval day_of_week = lower(strftime(_time,"%A"))
| where NOT (day_of_week="saturday" OR day_of_week="sunday")
| chart count by virus_global
| sort virus_global

Thanks!
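A minimal sketch of wiring the token into the search, assuming the choice values above; the search as posted hardcodes the weekend filter instead of referencing $weekends$, and the search command handles the day_of_week="*" wildcard choice, which | where would treat as a literal string:

`compliance("`console`", now(), -15d@d, mcafee,*, virus_, *, *, *)`
| eval day_of_week = lower(strftime(_time,"%A"))
| search $weekends$
| chart count by virus_global
| sort virus_global
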
I have this multivalue field where I am trying to rex and get one particular field value, like "value":"ESC1000", but instead I am getting multiple values. I tried this, but it shows all the field values:

| rex max_match=0 "value\":\"(?<ESC>[^\"]+)"

"productChanges":[
  {
    "Products":[
      {
        "productSpecContainmentID":"1",
        "changedCharacteristics":[
          { "characteristicID":"2", "value":"SERVICE" },
          { "characteristicID":"3", "value":"99" },
          { "characteristicID":"4", "value":"monthly" },
          { "characteristicID":"5", "value":"ESC1000" },
          { "characteristicID":"6", "value":"Discount" },
          { "characteristicID":"7", "value":"Escalation" },
          { "characteristicID":"8", "value":"AMOUNT" },
          { "characteristicID":"9", "value":"9" },
          { "characteristicID":"10", "value":"Y" },
          { "characteristicID":"11", "value":"N" }
        ],
        "temporaryID":"xxxaaaacccc"
      }
    ]
  }
]

Is there a way to get only the required field value, like the one above?
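A minimal sketch assuming the value you want is always the one paired with characteristicID 5; anchoring the regex on that ID (rather than on every "value" key) extracts just the one match:

| rex "\"characteristicID\":\"5\",\s*\"value\":\"(?<ESC>[^\"]+)"

Alternatively, if the value always matches a fixed shape like ESC followed by digits, the pattern "value\":\"(?<ESC>ESC\d+)" would also isolate it without relying on the ID.
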
Hi. I want to create a search that checks the last user login date in AWS. I can see the users in AWS IAM, and there are a bunch of them whose last login was more than 300 days ago, but it would be nice to see them in SPL. I would appreciate it if anyone could share their search or tell me how to build one. Thank you.
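A minimal sketch assuming CloudTrail data ingested via the Splunk Add-on for AWS (the index name is a placeholder, and the search window must reach back far enough to cover the period you care about, since it only sees users who have logged in within retention):

index=aws sourcetype="aws:cloudtrail" eventName=ConsoleLogin
| stats latest(_time) as last_login by userIdentity.arn
| eval days_since_login=round((now()-last_login)/86400)
| where days_since_login>300
| fieldformat last_login=strftime(last_login, "%Y-%m-%d %H:%M:%S")
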
I am trying to use my Windows event data to update User-ID on Panorama; however, running the below query in my ES environment returns the error: External search command 'panuserupdate' returned error code 2. Script output = "ERROR Unable to get apikey from firewall: local variable 'username' referenced before assignment".

The query:

index=wineventlog host=xxxxxx
| mvexpand Security_ID
| mvexpand Source_Network_Address
| dedup Security_ID Source_Network_Address
| search Security_ID!="NULL SID"
| rename Security_ID as user
| rename Source_Network_Address as src_ip
| panuserupdate panorama=x.x.x.x serial=000000000000
| fields user src_ip

Brief overview of my data ingestion: Panorama syslog is ingested into Splunk Cloud through a heavy forwarder. The Palo Alto Add-on for Splunk is installed on both the HF and Splunk Cloud, but no data is showing in the app; every panel shows 0. Also, I do have a user account in Panorama with API permissions.
Hi guys, I want to detect a service ticket (TGS) request (Windows event code 4769) that is not preceded by one of the following corresponding events:

1. User ticket (TGT) request, Windows event code 4768.
2. Ticket renewal request, Windows event code 4770.
3. Logon event, Windows event code 4624.

The following is the SPL I wrote (with the startswith/endswith spellings corrected), but I found that there is a problem. Could you help me modify it?

index="xx"
| transaction user maxspan=24h maxpause=10h startswith="Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624" endswith="Eventcode=4769" keepevicted=true
| search Eventcode=4769 NOT (Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624)
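A minimal alternative sketch that avoids transaction entirely, assuming the field is named EventCode (adjust capitalization to your extraction); it flags users who have a 4769 but no qualifying predecessor anywhere in the search window:

index="xx" EventCode IN (4768, 4769, 4770, 4624)
| stats count(eval(EventCode=4769)) as tgs_requests, count(eval(EventCode!=4769)) as preceding_events by user
| where tgs_requests>0 AND preceding_events=0

This checks co-occurrence over the whole window rather than strict ordering; if the sequence and the 24h/10h constraints matter, it would need tightening with bin _time or streamstats.
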
Hi, I need an SPL search that, given a src_ip and a dest_ip, shows the dest_url, the matching logs, and the outbound traffic size.
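A minimal sketch assuming CIM-style field names (src_ip, dest_ip, url, bytes_out) on proxy or firewall data; the index, IP values, and field names are placeholders to substitute:

index=proxy src_ip="10.1.2.3" dest_ip="203.0.113.10"
| stats values(url) as dest_url, sum(bytes_out) as outbound_bytes, count as events by src_ip, dest_ip
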
In some cases, I encounter problems parsing data with the CIM data model on Windows event log data.

For example, when searching for deleted and created user accounts using the data model:

| from datamodel:"Change"."Account_Management"
| search (action="created" OR action="deleted")

The "user" field is calculated correctly for "created" 4720 events. The calculation does not work correctly for 4726 events, where the user and source user fields are returned as unknown (even though they are present in the raw log data). I am using the Splunk TA for Windows to ingest data. What may be the cause of this behavior?
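A minimal sketch for checking what the TA actually extracts on the raw 4726 events, assuming a wineventlog index and the usual Windows TA field names; if user/src_user are already empty here, the data-model mapping has nothing to normalize:

index=wineventlog EventCode=4726
| table _time, host, user, src_user, Account_Name, Security_ID
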
Hi Team, my requirement is that each WriteRequest is one event and each ChangeItem is another event. Please help me with how to break the events:

12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150
12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2
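A minimal props.conf sketch under the assumption that every timestamped WriteRequest/ChangeItem line should become its own event (the sourcetype name is a placeholder); the lookahead keys the break on the time prefix:

[my_write_change_logs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{1,2}:\d{2}:\d{2}\s[AP]M\s(?:WriteRequest|ChangeItem))
TIME_FORMAT = %I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD = 12

If the intent is instead to group consecutive WriteRequest lines into one event and consecutive ChangeItem lines into another, the lookahead would need to fire only where the record type changes, which is harder to express in a single LINE_BREAKER.
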
Dear Team, we are planning to upgrade our existing underlying OS/VM infrastructure. As part of this process, we need to ensure the backup and restoration of our Splunk environment in case any issues arise. Below you can find the details of our environment:

- Search head cluster (SHC)
- A standalone Splunk Security SH
- Indexer cluster
- All other management servers (DS/CM/deployer/LM)
- Heavy forwarders/universal forwarders (UFs)

In addition to backing up $SPLUNK_HOME/etc and $SPLUNK_HOME/var, as well as the kvstore, are there any other components or data that we need to back up to ensure a successful restoration process?
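For the kvstore portion, a minimal sketch using the built-in backup command, assuming Splunk 7.1+ (the archive lands under $SPLUNK_HOME/var/lib/splunk/kvstorebackup by default; the archive name is a placeholder):

# Run on a search head as the splunk user
$SPLUNK_HOME/bin/splunk backup kvstore -archiveName kvstore_pre_upgrade
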
While running a Splunk search via a query and retrieving logs in an automated manner, the job extracts a maximum of only about 200,000 (2 lakh) logs. How can I resolve this issue?
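A minimal sketch of one common workaround, assuming the cap comes from the finished-job results limit: the /services/search/jobs/export endpoint streams results as they are produced and is not subject to that cap (host and credentials are placeholders):

curl -k -u admin:changeme https://localhost:8089/services/search/jobs/export \
     -d search="search index=main earliest=-24h" \
     -d output_mode=csv > results.csv

The alternative is paging a finished job's /results endpoint with offset and count parameters, or raising the relevant limits.conf setting, which has memory implications.
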
index=abcd
| stats count(eval(searchmatch(''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'))) as ''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X' OR count(eval(searchmatch('value2'))) as 'value2'

I'm getting this error: Error in 'stats' command: The argument '''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'' is invalid.

This works fine with many other URLs and IPs. Is there any special character that is not allowed with stats?
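A minimal sketch assuming the goal is counting events matching each string. In eval, single quotes denote field names, so searchmatch needs a double-quoted search string (with inner quotes escaped so the URL is treated as one literal term), stats aggregations are separated by commas rather than OR, and an alias full of special characters is safer replaced with a plain name:

index=abcd
| stats count(eval(searchmatch("\"https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X\""))) as drive_url_hits,
        count(eval(searchmatch("value2"))) as value2_hits
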
Hi, I have a modular input that is connected to CIM through eventtypes and tags as follows:

default/eventtypes.conf

[my_event_type]
search = sourcetype=my_default_source_type

default/tags.conf

[eventtype=my_event_type]
alert = enabled

This generally works, up until a user decides to reconfigure the default sourcetype and index. When the sourcetype is altered, the eventtype stanza breaks. How do I go about this? What is the best practice for allowing a user to reconfigure the sourcetype while ensuring the CIM integration keeps working? Should I rely on other fields that I create? Can this hurt the efficiency of the query? Thanks
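One hedged sketch: modular input events default to source=<scheme>://<input_name>, which usually survives a user changing the sourcetype or index; if your input doesn't override source, the eventtype can key on it instead (my_scheme is a placeholder for your input's scheme, and whether source stays stable in your app is the assumption to verify):

default/eventtypes.conf

[my_event_type]
search = source=my_scheme://*
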
How do I import the BeautifulSoup module in Splunk's Python so that my Python script works?
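A minimal sketch assuming you vendor the library inside your app, since Splunk's bundled Python does not ship BeautifulSoup; the lib directory name is a convention, not a requirement:

import os
import sys

# Make the app-local library directory importable before the import.
# Assumes bs4 was vendored with: pip install --target <app>/bin/lib beautifulsoup4
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib"))

from bs4 import BeautifulSoup  # noqa: E402

soup = BeautifulSoup("<html><body><p>hello</p></body></html>", "html.parser")
print(soup.p.text)
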
Hi all, we tried to use the SentinelOne SOAR app to implement a playbook that blocks a hash on SentinelOne. SentinelOne SOAR app: https://splunkbase.splunk.com/app/6056

We found that it applied the hash to only some groups on SentinelOne. Has anyone seen this issue before? Please advise.
Hello, I want to detect workstations authenticated to Active Directory that are not compliant with our naming convention (the hostname should start with the country code followed by 6 digits; for example, a host from Italy: IT000121). I already have a lookup file (| inputlookup code_countries.csv | table alpha-2), but I don't know how to compare it with the 'Workstation' field in my Active Directory index to check that it matches the naming convention described above.

Regards,
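A minimal sketch assuming the lookup column is alpha-2 and the event field is Workstation (the index name is a placeholder); it extracts the two-letter prefix, validates the CC+6-digits shape with match(), and flags hosts whose prefix is not a known country code or whose shape is wrong:

index=active_directory Workstation=*
| eval prefix=upper(substr(Workstation, 1, 2))
| eval shape_ok=if(match(Workstation, "^[A-Za-z]{2}\d{6}$"), 1, 0)
| search shape_ok=0 OR NOT [| inputlookup code_countries.csv | rename "alpha-2" as prefix | fields prefix]
| table Workstation prefix shape_ok
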