All Posts



Hello Everyone,

I want to check if a field called "from_header_displayname" contains any Unicode. Below is the event source; this example event contains the Unicode escape "\u0445":

"from_header_displayname": "'support@\u0445.comx.com'"

And the following is what I see from the web console, where the escape has been rendered as "х" (note: it's not the real letter x, but a similar-looking letter from another alphabet):

from_header_displayname: 'support@х.comx.com'

I used the following search but had no luck:

index=email | regex from_header_displayname="[\u0000-\uffff]"

Error in 'SearchOperator:regex': The regex '[\u0000-\uffff]' is invalid. Regex: PCRE2 does not support \F, \L, \l, \N{name}, \U, or \u.

Please advise what I should use in this case. Thanks in advance.

Regards,
Iris
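A note on the error above: PCRE2 rejects \uXXXX escapes (it uses \x{XXXX} instead), and any non-ASCII character can be caught with the class [^\x00-\x7F]. Here is a minimal Python sketch of that character-class check (Python's re engine is not PCRE2, but this class behaves the same way in both):

```python
import re

# Any character outside the ASCII range 0x00-0x7F counts as "Unicode" here.
NON_ASCII = re.compile(r"[^\x00-\x7f]")

def has_non_ascii(value: str) -> bool:
    """Return True if the field value contains any non-ASCII character."""
    return NON_ASCII.search(value) is not None

# The Cyrillic letter U+0445 from the example event:
print(has_non_ascii("support@\u0445.comx.com"))  # True
print(has_non_ascii("support@x.comx.com"))       # False
```

In SPL the analogous search would be something along the lines of `| regex from_header_displayname="[^\x00-\x7F]"` (an untested sketch, but the character class itself is valid PCRE2).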
Hi @PickleRick I will try the below and update here. Thanks
Hi @chenfan, as @isoutamo said, please open a new question; it will be easier to answer you, and you will probably get a faster and better answer. Anyway, I'm not an expert on Dashboard Studio, which I use only when I cannot use Classic Dashboards, so I'm not sure I can help you. In the new question, please describe your request in more detail, because it isn't clear which object's colour you want to change.

Ciao.
Giuseppe
@randoj unfortunately, I cannot share the exact files. However, you should be able to get the incident id for each finding using its calculated rule_id (compare the eval statement for rule_id/event_id in [Incident Review - Main] in SA-ThreatIntelligence/default/savedsearches.conf) via the mc_incidents collection, which has a field notable_id iirc. Then, use that id as a key against the mc_notes collection, and you can get notes for findings. Hope this clears things up a bit!
Hi @tamalunp

You could try with searchmatch maybe?

| eval isFoo=if(searchmatch("[\"foo\"]"),"yes","no")

Full example:

| windbag
| head 1
| eval _raw="This is a test message [\"foo\"] bar"
| eval isFoo=if(searchmatch("[\"foo\"]"),"yes","no")
| table _raw isFoo

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
Hey @SGun ,  how did you end up implementing it?  Thanks! 
Hey @Dilsheer_P ,  did you find a way that worked for you?  Thanks!
[Solution] @Niro

You can get the desired result by modifying transforms.conf as follows:

1. /opt/splunk/etc/apps/myapp/local/transforms.conf

[pan_src_user]
INGEST_EVAL = src_ip=replace(_raw, ".*src_ip=([0-9.]+).*","\1"), src_user_idx=json_extract(lookup("user_ip_mapping.csv", json_object("src_ip", src_ip), json_array("src_user")), "src_user")

Result:
@patelmc If the application field is a search-time extracted field, it's unavailable during ingest-time processing. If you want to use it during indexing, you have to first extract it as an indexed field (and you can subsequently "forget" it so that it doesn't actually get indexed). Bonus points question: why extract those indexed fields at all?

@victor1004k Don't put settings in system/local. Put them into a separate app so they're easier to maintain.
Hi @sainag_splunk

Thanks for your reply. I am using AppDynamics SaaS controller version 25.1.2. I am not sure where the option to specify font settings in Dash Studio is; can you please help here?

Regards,
Gopikrishnan R.
[Solution] @patelmc

You can achieve the desired result by modifying the content below slightly.

1. /opt/splunk/etc/apps/myapp/local/transforms.conf

[Active_Events]
INGEST_EVAL = application=replace(_raw, ".*application=(\w+).*", "\1"), APP=json_extract(lookup("APP_COMP.csv", json_object("application", application), json_array("APP")),"APP"), COMP=json_extract(lookup("APP_COMP.csv", json_object("application", application), json_array("COMP")),"COMP")

2. Result
The easiest way to see _raw is to open the event and select "Show source" from "Event actions". Then you can see if there are e.g. some escape characters like \u0022 => "
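For reference, \u0022 is just the Unicode/JSON escape for the double-quote character; a quick Python check:

```python
# "\u0022" is the Unicode escape for the double-quote character (").
decoded = "\u0022"
print(decoded == '"')  # True
```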
So this example shows that LIKE works with the [:

| makeresults
| eval _raw="bla bla [\"foobar\"] bla bla"
| eval hasFoobar = case(_raw LIKE "%[\"foobar%", "Y")
| eval hasFoobar = if(hasFoobar = "Y", "YES", "NO")
| table _raw, hasFoobar

so there may be something odd with your data. Your example shows "table message", not _raw. Can you provide an example of _raw?
Hi @bowesmana Can you help me to check it, thanks!
@Skins , @moja

Hello, below is the solution for your question.

1. /opt/log/syslog-ng-sample.log

May 13 15:09:09 1.2.3.4 sim: logging for test

2. /opt/splunk/etc/apps/myapp/lookups/lookup.csv

host,host_value
1.2.3.4,myhostname

3. /opt/splunk/etc/apps/myapp/local/props.conf

[mysourcetype]
TRANSFORMS-host_override = host_override

4. /opt/splunk/etc/apps/myapp/local/transforms.conf

[host_override]
INGEST_EVAL = host=replace(_raw, "^\w+\s+\d+\s+\d+:\d+:\d+\s+([^ ]+)\s+.*", "\1"), hostname=host, host=json_extract(lookup("lookup.csv",json_object("host",host),json_array("host_value")),"host_value")

5. Result
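The INGEST_EVAL above first captures the host token from the raw syslog line and then maps it through the CSV. The same logic can be sketched in Python (the regex and sample data are taken directly from the post):

```python
import re

# Sample event and lookup data from the post.
raw = "May 13 15:09:09 1.2.3.4 sim: logging for test"
lookup = {"1.2.3.4": "myhostname"}  # host -> host_value, as in lookup.csv

# Same pattern as the replace() in the INGEST_EVAL:
# month, day, time, then the host token.
m = re.match(r"^\w+\s+\d+\s+\d+:\d+:\d+\s+([^ ]+)\s+", raw)
hostname = m.group(1)                  # original value, preserved in "hostname"
host = lookup.get(hostname, hostname)  # overridden "host", falling back to the raw token

print(hostname)  # 1.2.3.4
print(host)      # myhostname
```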
Hi @livehybrid

I have tested it and it seems to be working fine, although I got a few issues, like getting "Invalid array length" (I had to refresh the browser to fix this), and the table displaying all rows instead of the number of rows I specified (rows displayed = 10). The "Invalid array length" error is intermittent.

I have some follow-up questions just to make sure I understand. Thank you for your help.

1. a. Is there a limitation on the number of data sources?
b. In my case, I need to change it like the following, correct?
"ds_index1": "ds_index1" (not "search1": "ds_index1")
Can you explain what this means: "ds_index1": "ds_index1"?
c. ds_xxxx is a random identifier created by Splunk; do you usually change it to a readable format, or do you just leave it? (Which one is best practice?)
d. I also need to change $mysearch$ to $datasource_token$, correct?

"viz_gE0iilm3": {
  "dataSources": {
    "primary": "ds_index1",
    "ds_index1": "ds_index1",
    "ds_index2": "ds_index2"
  },
  "options": {
    "table": "> $datasource_token$"
  },
  "type": "splunk.table"
}

I was trying to set the token when clicking a single value. Please let me know if this is correct:

{
  "type": "splunk.singlevalue",
  "dataSources": {
    "primary": "ds_singlevalue1"
  },
  "title": "Single Value 1",
  "eventHandlers": [
    {
      "type": "drilldown.setToken",
      "options": {
        "tokens": [
          {
            "token": "datasource_token",
            "value": "ds_index1"
          }
        ]
      }
    }
  ]
}

{
  "type": "splunk.singlevalue",
  "dataSources": {
    "primary": "ds_singlevalue2"
  },
  "title": "Single Value 2",
  "eventHandlers": [
    {
      "type": "drilldown.setToken",
      "options": {
        "tokens": [
          {
            "token": "datasource_token",
            "value": "ds_index2"
          }
        ]
      }
    }
  ]
}

Also, it doesn't load at the beginning, so I need to set a default token. Is this correct?

"defaults": {
  "dataSources": {
    "ds.search": {
      "options": {
        "queryParameters": {
          "earliest": "-24h@h",
          "latest": "now"
        }
      }
    }
  },
  "tokens": {
    "default": {
      "datasource_token": {
        "value": "ds_index1"
      }
    }
  }
}
I need to find whether the string ["foobar"] exists in a log message. I have a search query like

some stuff
| eval hasFoobar = case(_raw LIKE "%\"foobar%", "Y")
| eval hasFoobar = if(hasFoobar = "Y", "YES", "NO")
| table message, hasFoobar

which gives YESes as expected. If I add a square bracket, whether escaped or not, I only get NOes. E.g.,

some stuff
| eval hasFoobar = case(_raw LIKE "%[\"foobar%", "Y")
| eval hasFoobar = if(hasFoobar = "Y", "YES", "NO")
| table message, hasFoobar

some stuff
| eval hasFoobar = case(_raw LIKE "%\[\"foobar%", "Y")
| eval hasFoobar = if(hasFoobar = "Y", "YES", "NO")
| table message, hasFoobar

Any advice?
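For what it's worth, since % and _ are the only wildcards in LIKE, a [ should match literally, so the intended check reduces to a plain substring test. A minimal Python sketch (the sample messages are hypothetical):

```python
# LIKE "%[\"foobar%" is effectively a literal substring test for '["foobar'.
def has_foobar(raw: str) -> str:
    """Mirror the SPL case()/if() pair: YES if the literal marker is present."""
    return "YES" if '["foobar' in raw else "NO"

print(has_foobar('bla bla ["foobar"] bla bla'))   # YES
print(has_foobar('no brackets here: "foobar"'))   # NO
```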
When a UF sends data via HTTP it uses the Splunk-to-Splunk protocol, which logstash doesn't support.
Can you fully expand an example of the search? I assume Channel is a visible field in the event list? Have you explicitly specified Channel as a field in the SPL?