All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


As @sylim_splunk already stated, it's managed by the OS. If your actual memory usage is low even though swap is fully used, that is usually not a problem, especially on modern servers with NVMe SSD storage. If you really don't want the system to swap, you can disable it with:

sudo swapoff -a

Keep in mind that if the system uses all its RAM while swap is off, the Linux OOM killer might kill your Splunk processes, which can lead to loss of searches/search results.
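If you want to script such a check, swap utilization on Linux can be derived from the SwapTotal and SwapFree lines in /proc/meminfo. A minimal Python sketch (the sample values below are made up; on a real host you would read /proc/meminfo itself):

```python
# Sketch: compute swap utilization from /proc/meminfo-style text.
def swap_usage_percent(meminfo_text: str) -> float:
    """Return percent of swap in use, or 0.0 if no swap is configured."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        fields[key.strip()] = int(rest.split()[0])  # values are in kB
    total = fields.get("SwapTotal", 0)
    if total == 0:
        return 0.0
    used = total - fields.get("SwapFree", 0)
    return 100.0 * used / total

# Hypothetical sample; on a real host: open("/proc/meminfo").read()
sample = """SwapTotal:       4194304 kB
SwapFree:        3145728 kB"""
print(swap_usage_percent(sample))  # 25.0
```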
That's correct. Hence it's important to secure your Splunk environment properly, for example by using MFA and SAML. For more: https://docs.splunk.com/Documentation/Splunk/latest/Security/WhatyoucansecurewithSplunk
To check if a field contains Unicode characters, you can use the regex command with a regular expression that matches non-ASCII characters, but if you want to filter, you may be better off with something like match:

index=email
| eval is_unicode = if(match(from_header_displayname, "[^\x00-\x7F]"), "true", "false")
| where is_unicode="true"

This search uses the match function to check whether the from_header_displayname field contains any characters outside the ASCII range (\x00-\x7F). If it does, the is_unicode field is set to "true". Alternatively, you can filter the events directly using the where command with the match function:

index=email
| where match(from_header_displayname, "[^\x00-\x7F]")

Here is another working example:

| makeresults
| eval from_header_displayname="support@\u0445.comx.com"
| eval from_header_displayname_unicode="support@х.comx.com"
| table from_header_displayname from_header_displayname_unicode
| eval unicode_detected_raw=if(match(from_header_displayname,"[^\x00-\x7F]"),"Yes","No")
| eval unicode_detected_unicode=if(match(from_header_displayname_unicode,"[^\x00-\x7F]"),"Yes","No")
| table from_header_displayname unicode_detected_raw from_header_displayname_unicode unicode_detected_unicode

Both of these approaches will help you identify events where the from_header_displayname field contains Unicode characters.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
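The character-class logic used by match() above can be sketched in plain Python, to see what [^\x00-\x7F] actually does (the sample addresses are made-up examples):

```python
import re

# [^\x00-\x7F] matches any character outside the 7-bit ASCII range,
# i.e. anything that is not plain ASCII.
NON_ASCII = re.compile(r"[^\x00-\x7F]")

def has_unicode(value: str) -> bool:
    return bool(NON_ASCII.search(value))

print(has_unicode("support@\u0445.comx.com"))  # True  (Cyrillic х, U+0445)
print(has_unicode("support@x.comx.com"))       # False (plain ASCII x)
```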
Unicode includes the ASCII characters, so 0000-ffff would include all 16-bit characters. If you are looking for any 16-bit characters, you could do either of these:

| eval hasUnicode=if(match(string, "[^[:ascii:]]"), "HAS-NON-ASCII", "ASCII")
| eval hasUnicode=if(match(string, "[^\x00-\xff]"), "HAS-16-BIT CHARS", "8-BIT")

The first character class is ascii, checking for any characters NOT in the ASCII range (0x00-0x7f); the second checks for any non-8-bit characters. So this example, which includes your lower-case Cyrillic х, demonstrates it:

| makeresults
| eval string=printf("{\"from_header_displayname\": \"'support@%c.comx.com'\"}", 1024+69)
| eval hasUnicode1=if(match(string, "[^[:ascii:]]"), "HAS-NON-ASCII", "ASCII")
| eval hasUnicode2=if(match(string, "[^\x00-\xff]"), "HAS-16-BIT", "8-BIT")
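The two-level distinction (non-ASCII vs. beyond 8-bit) can be sketched in Python using code points rather than regex classes; the sample strings are made-up examples:

```python
def classify(value: str) -> str:
    # Mirrors the two SPL checks above: first anything beyond one byte
    # (> 0xFF), then anything beyond 7-bit ASCII (> 0x7F).
    if any(ord(ch) > 0xFF for ch in value):
        return "HAS-16-BIT"
    if any(ord(ch) > 0x7F for ch in value):
        return "HAS-NON-ASCII"
    return "ASCII"

print(classify("support@" + chr(1024 + 69) + ".comx.com"))  # HAS-16-BIT (Cyrillic х)
print(classify("caf\xe9"))                                  # HAS-NON-ASCII (é is 0xE9)
print(classify("plain"))                                    # ASCII
```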
Hi @Varun18,
no, there isn't any direct way. The only workaround, if you have a Deployment Server, is to create (on that server) a monitor stanza that reads all the conf files in the apps under $SPLUNK_HOME/etc/deployment-apps and sends them to an index. That way, you can access this information via SPL.
Ciao.
Giuseppe
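As a sketch of what that monitor stanza might look like in inputs.conf on the Deployment Server (the index name ds_configs and the sourcetype are hypothetical examples, not from the original post):

```ini
# inputs.conf on the Deployment Server -- index/sourcetype names are made up
[monitor://$SPLUNK_HOME/etc/deployment-apps]
whitelist = \.conf$
index = ds_configs
sourcetype = splunk_conf
```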
Hi Team, Is there a direct way to retrieve a list of usernames or accounts configured in Splunk Add-ons (such as those used in modular inputs, scripted inputs, or API connections) using Splunk SPL? Regards, VK
@Iris_Pi   
Hello Everyone,
I want to check if a field called "from_header_displayname" contains any Unicode. Below is the event source; this example event contains the Unicode escape "\u0445":

"from_header_displayname": "'support@\u0445.comx.com'

And the following is what I see in the web console; the Unicode has been rendered as "х" (note: it's not the real letter x, but a similar-looking letter from another alphabet):

from_header_displayname: 'support@х.comx.com'

I used the following search but had no luck:

index=email | regex from_header_displayname="[\u0000-\uffff]"

Error in 'SearchOperator:regex': The regex '[\u0000-\uffff]' is invalid. Regex: PCRE2 does not support \F, \L, \l, \N{name}, \U, or \u.

Please advise what I should use in this case. Thanks in advance.
Regards,
Iris
Hi @PickleRick I will try the below and update here. Thanks
Hi @chenfan,
as @isoutamo said, please open a new question; it's easier to answer, and you'll probably get a faster and better answer.
Anyway, I'm not an expert on Dashboard Studio, which I use only when I cannot use Classic Dashboards, so I'm not sure I'll be able to help you.
In the new question, please describe your request in more detail, because it isn't clear which object's colour you want to change.
Ciao.
Giuseppe
@randoj unfortunately, I cannot share the exact files. However, you should be able to get the incident id for each finding using its calculated rule_id (compare the eval statement for rule_id/event_id in [Incident Review - Main] in SA-ThreatIntelligence/default/savedsearches.conf) via the mc_incidents collection, which has a field notable_id iirc. Then, use that id as a key against the mc_notes collection, and you can get notes for findings. Hope this clears things up a bit!
Hi @tamalunp
You could try with searchmatch maybe?

| eval isFoo=if(searchmatch("[\"foo\"]"),"yes","no")

Full example:

| windbag
| head 1
| eval _raw="This is a test message [\"foo\"] bar"
| eval isFoo=if(searchmatch("[\"foo\"]"),"yes","no")
| table _raw isFoo

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
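For the intent of that check (does the raw event contain the literal text ["foo"]?), a rough Python analogue is a plain substring test. This is only a sketch of the intended result, not how searchmatch evaluates search expressions internally:

```python
def is_foo(raw: str) -> str:
    # Literal substring check for ["foo"], brackets and quotes included.
    return "yes" if '["foo"]' in raw else "no"

print(is_foo('This is a test message ["foo"] bar'))  # yes
print(is_foo('This is a test message foo bar'))      # no
```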
Hey @SGun ,  how did you end up implementing it?  Thanks! 
Hey @Dilsheer_P ,  did you find a way that worked for you?  Thanks!
[Solution] @Niro
You can get the desired result by modifying transforms.conf as follows:

1. /opt/splunk/etc/apps/myapp/local/transforms.conf

[pan_src_user]
INGEST_EVAL = src_ip=replace(_raw, ".*src_ip=([0-9.]+).*","\1"), src_user_idx=json_extract(lookup("user_ip_mapping.csv", json_object("src_ip", src_ip), json_array("src_user")), "src_user")

Result:
@patelmc If the application field is a search-time extracted field it's unavailable during ingest-time processing. If you want to use it during indexing you have to first extract it as an indexed field (and can subsequently forget it so that it doesn't get indexed). Bonus points question - why extracting those indexed fields? @victor1004k Don't put settings in system/local. Put them into a separate app so they're easier to maintain.
Hi @sainag_splunk
Thanks for your reply. I am using AppDynamics SaaS controller version 25.1.2. I am not sure where the option to specify font settings in Dash Studio is; can you please help here?
Regards,
Gopikrishnan R.
[Solution] @patelmc
You can achieve the desired result by modifying the content below slightly.

1. /opt/splunk/etc/apps/myapp/local/transforms.conf

[Active_Events]
INGEST_EVAL = application=replace(_raw, ".*application=(\w+).*", "\1"), APP=json_extract(lookup("APP_COMP.csv", json_object("application", application), json_array("APP")),"APP"), COMP=json_extract(lookup("APP_COMP.csv", json_object("application", application), json_array("COMP")),"COMP")

2. Result
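The technique (regex-extract a key from _raw, then enrich from a CSV lookup at ingest time) can be sketched in Python; the CSV contents and raw event below are made-up examples, not from the actual APP_COMP.csv:

```python
import csv
import io
import re

# Hypothetical lookup contents -- in Splunk this would be APP_COMP.csv.
csv_text = """application,APP,COMP
payroll,HR,Finance"""

# Build a dict keyed on the lookup's input field, like lookup(...) does.
mapping = {row["application"]: row for row in csv.DictReader(io.StringIO(csv_text))}

raw = "2024-01-01 application=payroll status=active"  # made-up event
m = re.search(r"application=(\w+)", raw)              # mirrors the replace() regex
app_row = mapping.get(m.group(1), {}) if m else {}
print(app_row.get("APP"), app_row.get("COMP"))  # HR Finance
```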
The easiest way to see _raw is to open the event and select "Show Source" from "Event Actions". Then you can see whether there are, e.g., escape characters like \u0022 => "
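A quick way to see what such an escape decodes to is to run the raw JSON text through a JSON parser; a small Python sketch with a made-up raw string:

```python
import json

# Raw JSON text containing literal \u0022 escape sequences (hypothetical example).
raw = '"say \\u0022hello\\u0022"'
print(json.loads(raw))  # say "hello"
```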
So this example shows that LIKE works with the [:

| makeresults
| eval _raw="bla bla [\"foobar\"] bla bla"
| eval hasFoobar = case(_raw LIKE "%[\"foobar%", "Y")
| eval hasFoobar = if(hasFoobar = "Y", "YES", "NO")
| table _raw, hasFoobar

So there may be something odd with your data. Your example shows table message, not _raw. Can you provide an example of _raw?