All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @gcusello , e.g.: from a specific src_ip=xx.xx.xx.xx to dest_ip=xx.xx.xx.xx
Hi @gcusello , it does. We are doing indexer-level props & transforms for other sourcetypes as well, and it is working fine. The documentation also says the same (Manage private apps on your Splunk Cloud Platform deployment - Splunk Documentation): "When you install an app using self-service app installation on Classic Experience, the app is automatically installed on all regular search heads and search head cluster members across your deployment. The app is also installed on indexers"
Hello @gcusello  No, I want to find computer names that do not conform to a naming convention. The computer name should start with the country code (e.g., Italy: IT, France: FR, USA: US), followed by 6 digits.
Computer Name: US111220 => Good
Computer Name: DESKTOP-121 => BAD
Computer Name: FR000121 => Good
Computer Name: Kali => BAD
Best Regards,
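The convention described above boils down to an anchored regex: a two-letter country code followed by exactly six digits. A minimal Python sketch of that check, assuming the country-code list is just the three examples from the post (extend the alternation for your real environment); in Splunk the same pattern could be used with something like `| regex Computer_Name!="^(IT|FR|US)\d{6}$"` to surface the non-conforming names:

```python
import re

# Naming convention: two-letter country code, then exactly 6 digits.
# (IT|FR|US) is only the example list from the post, not an exhaustive set.
PATTERN = re.compile(r"^(IT|FR|US)\d{6}$")

def conforms(name: str) -> bool:
    """Return True if the computer name matches the convention."""
    return PATTERN.fullmatch(name) is not None

names = ["US111220", "DESKTOP-121", "FR000121", "Kali"]
non_conforming = [n for n in names if not conforms(n)]
print(non_conforming)  # ['DESKTOP-121', 'Kali']
```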
12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150

into one event, and

12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2

into another event. LINE_BREAKER = ? Please help me.
Please tell us more about the automated method you are using.  Which method is it?  Does it specify the maxout parameter? You may find some help at https://hurricanelabs.com/splunk-tutorials/the-best-guide-for-exporting-massive-amounts-of-data-from-splunk
Hi @AL3Z, your request isn't so clear: you want all the events from a src_ip to a dest_ip, then the list of dest_urls and the outbound traffic size, is that correct? You could try something like this:
index=your_index sourcetype=your_sourcetype
| stats values(dest_url) AS dest_url values(logs) AS logs sum(bytes_in) AS bytes_in sum(bytes_out) AS bytes_out BY src_ip dest_ip
| eval traffic_MB_size=(bytes_in+bytes_out)/1024/1024
Ciao. Giuseppe
Hi guys, I want to detect a service ticket (TGS) request (Windows event code 4769) that is not preceded by one of the following corresponding events:
1. User ticket (TGT) request, Windows event code 4768.
2. Ticket renewal request, Windows event code 4770.
3. Logon event, Windows event code 4624.
The following is the SPL I wrote, but I found that there is a problem. Could you help me modify it?
index="xx"
| transaction user maxspan=24h maxpause=10h startswith="Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624" endswith="Eventcode=4769" keepevicted=true
| search Eventcode=4769 NOT (Eventcode=4768 OR Eventcode=4770 OR Eventcode=4624)
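The correlation being asked for is: for each user, flag any 4769 that has no qualifying predecessor (4768/4770/4624) within the lookback window. A minimal Python sketch of that logic over a hypothetical, time-sorted event stream (the event tuples and the 24h window are illustrative assumptions, not the poster's data):

```python
from datetime import datetime, timedelta

# Hypothetical event stream: (timestamp, user, event_code), sorted by time.
events = [
    (datetime(2023, 6, 25, 9, 0), "alice", 4768),   # TGT request
    (datetime(2023, 6, 25, 9, 5), "alice", 4769),   # TGS preceded by TGT -> fine
    (datetime(2023, 6, 25, 10, 0), "bob", 4769),    # TGS with no prior TGT -> suspicious
]

PRECEDING = {4768, 4770, 4624}
WINDOW = timedelta(hours=24)

def orphan_tgs_requests(events):
    """Return 4769 events with no 4768/4770/4624 for the same user within WINDOW."""
    last_seen = {}  # user -> time of most recent qualifying preceding event
    orphans = []
    for ts, user, code in events:
        if code in PRECEDING:
            last_seen[user] = ts
        elif code == 4769:
            prior = last_seen.get(user)
            if prior is None or ts - prior > WINDOW:
                orphans.append((ts, user))
    return orphans

print(orphan_tgs_requests(events))  # only bob's request is flagged
```

In SPL terms this corresponds to grouping by user and checking whether the transaction that ends in 4769 ever contained one of the preceding codes, which is what the `keepevicted=true` plus the final `NOT (...)` filter above is attempting.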
Hi, I need an SPL query that, from src_ip to dest_ip, shows the dest_url, logs, and outbound traffic size.
Hi @Sid , I'm not sure that an app uploaded to Splunk Cloud is also deployed to the indexers. Open a case with Splunk Support for this. Ciao. Giuseppe
With this filter I see the combined risk classification count per host:
index=test Risk IN (Critical,High,Medium)
| timechart span=30 count by extracted_Host
I'm now trying to filter and visualize so I can see how often a host has the rating Critical, how often High, etc., and not, like now, only the combined value of all risk classifications:
index=test
| stats count by extracted_Host, Risk
| stats values(Risk) as Risk by extracted_Host
| eval has_Critical=mvcount(split(Risk, ",")) > 0
| eval has_High=mvcount(split(Risk, ",")) > 0
| eval has_Medium=mvcount(split(Risk, ",")) > 0
| stats sum(has_Critical) as Critical_Count, sum(has_High) as High_Count, sum(has_Medium) as Medium_Count by extracted_Host
But I don't get an output. Thanks for the help.
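The goal described above is essentially a pivot: count occurrences of each (host, risk) pair rather than one combined total per host. In SPL that is typically a single `| chart count over extracted_Host by Risk`. As a minimal Python sketch of the same counting, using invented sample rows (host1/host2 are illustrative, not from the post):

```python
from collections import Counter

# Hypothetical (host, risk) pairs as they might come back from the index.
rows = [
    ("host1", "Critical"), ("host1", "High"), ("host1", "Critical"),
    ("host2", "Medium"),
]

# Count how often each host carries each rating, not one combined total.
counts = Counter(rows)  # (host, risk) -> occurrences
per_host = {}
for (host, risk), n in counts.items():
    per_host.setdefault(host, {})[risk] = n

print(per_host)
# {'host1': {'Critical': 2, 'High': 1}, 'host2': {'Medium': 1}}
```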
Your app that needs BeautifulSoup must include it in its bin/lib directory.  Do not attempt to change or add to the Python libraries delivered with Splunk.
No matter how foolproof your configuration, some clever fool will find a way to break it. Educate your users about the importance of letting you know about changes to sourcetypes so you can make the necessary changes to configurations. IMO, sourcetypes should not change unless the structure/format of the data changes. As for the efficiency question, it depends. If sourcetypes and indexes change often enough that searches must look for multiples of each then, yes, efficiency is affected. Searching two indexes is less efficient than searching one index.
Were you ever able to solve this issue?
Hi @nivi ... please check the limits.conf file:
[searchresults]
maxresultrows = 50000
https://docs.splunk.com/Documentation/Splunk/9.1.1/Admin/Limitsconf#limits.conf.example
May we know more details, like: is it a CSV log, a regular log file, or something else? Splunk Enterprise or Splunk Cloud? Are you planning to increase or decrease this limit (2 lakh, i.e. 200,000, logs is itself a very big limit)?
In some cases, I encounter problems parsing Windows event log data with the CIM data model. For example, when searching for deleted and created user accounts using the data model:
| from datamodel:"Change"."Account_Management"
| search (action="created" OR action="deleted")
The "user" field is calculated correctly for "created" 4720 events. The field calculation does not work correctly for 4726 events, where the user and source user fields are returned as unknown (even though they are present in the raw log data). I am using the Splunk TA for Windows to ingest the data. What may be the cause of this behavior?
Hi @vijreddy30 ... The simplest approach: you can do line breaking at each line (since you have nice timestamps). If you have any other "special requirements" for these logs, please share more details and we can help with new line-breaking ideas. Thanks.
Hi Team, my requirement is that the WriteRequest lines form one event and the ChangeItem lines another event. Please help me with how to break the events:
12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150
12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2
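The grouping being asked for is: consecutive lines with the same operation name (WriteRequest or ChangeItem) belong to one event, and the event boundary is where the operation changes. A minimal Python sketch of that intent (the sample text is from the post; the regex is an assumption about the timestamp/operation layout). In props.conf, one candidate LINE_BREAKER that breaks only at such a transition might look like `(?:WriteRequest[^\r\n]*)([\r\n]+)(?=\d{1,2}:\d{2}:\d{2}\s+[AP]M\s+ChangeItem)`, but test it against your real feed:

```python
import re
from itertools import groupby

raw = """12:49:28 PM WriteRequest Remotexxxxxxxxxx+=0
12:49:28 PM WriteRequest Remotexxxxxxxxxxxxxx-=0
12:49:28 PM WriteRequest xxxxxxxx=ABEMA150
12:50:22 PM ChangeItem StatusDevices.xxxxxxxx=1
12:50:22 PM ChangeItem CurrentTest.DateEnd=25.06.2023 12:50:22
12:50:22 PM ChangeItem CurrentTesxxxxxxx=2"""

# Pull the operation name (WriteRequest / ChangeItem) out of each line.
op = re.compile(r"^\d{1,2}:\d{2}:\d{2}\s+[AP]M\s+(\w+)")

def split_events(text):
    """Group consecutive lines sharing the same operation into one event."""
    lines = text.splitlines()
    events = []
    for _, run in groupby(lines, key=lambda l: op.match(l).group(1)):
        events.append("\n".join(run))
    return events

events = split_events(raw)
print(len(events))  # 2: one WriteRequest event, one ChangeItem event
```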
Dear Team, we are planning to upgrade our existing underlying OS/VM infrastructure. As part of this process, we need to ensure backup and restoration of our Splunk environment in case any issues arise. Below are the details of our environment:
- Search Head Cluster (SHC)
- A standalone Splunk Security SH
- Indexer cluster
- All other management servers (DS/CM/deployer/LM)
- Heavy Forwarders/Universal Forwarders (UFs)
In addition to backing up $SPLUNK_HOME/etc and $SPLUNK_HOME/var, as well as the KV store, are there any other components or data that we need to back up to ensure a successful restoration process?
While doing a Splunk search using a query and retrieving logs in an automated manner, the export job extracts only a maximum of 2 lakh (200,000) logs. How can I resolve this issue?
index=abcd
| stats count(eval(searchmatch(''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'))) as ''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X' OR count(eval(searchmatch('value2'))) as 'value2'
I'm getting this error: Error in 'stats' command: The argument '''https://drive.google.com/uc?export=download&id=1HGFF5ziAFGn8161CKQC$Xyuhni9PNK_X'' is invalid.
This works fine with many other URLs and IPs. Is there any special character that is not allowed with stats?