All Posts

I already have an ingest eval in place; I only need to extract the fqdn from vs_name in order to match there.
Hi @jialiu907

Have a look at the below. I've suggested two ways you can determine your Disconnect field based on that value. Is this what you're after?

| makeresults
| eval _raw="<28>1 2025-02-19T15:14:00.968210+00:00 aleoweul0169x falcon-sensor-bpf 1152 - - CrowdStrike(4): SSLSocket Disconnected from Cloud."
| rex "\)\:\s(?<Disconnect>SSLSocket Disconnected from Cloud)"
| eval Disconnect2=if(searchmatch("SSLSocket Disconnected from Cloud"),1,0)

Please let me know how you get on, and consider accepting this answer or adding karma to this answer if it has helped.

Regards
Will
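If you want this to be a persistent search-time extraction rather than an inline rex, a props.conf sketch might look like the following (the sourcetype name is a placeholder assumption):

== props.conf ==
[your_sourcetype]
# search-time field extraction; same regex as the rex above
EXTRACT-disconnect = \)\:\s(?<Disconnect>SSLSocket Disconnected from Cloud)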
Hi @splunklearner

This is quite complex to achieve in props/transforms but shouldn't be impossible. Let's have a go.

This is what it would look like as SPL. Use this to tweak your eval to match your field names and config, then apply it to the transforms as below.

| makeresults
| eval _raw="something=v-jupiter-prd-cbc-us.sony-443-ipv6"
| eval hostType=replace(_raw, ".*v\-(?<hostType>[^\.]+)\.sony.*", "\1")
| eval yourIndex=json_extract(lookup("testlookup.csv", json_object("hostType", hostType), json_array("index")), "index")
``` as one line ```
| eval yourIndexNew=json_extract(lookup("testlookup.csv", json_object("hostType", replace(_raw, ".*v\-(?<hostType>[^\.]+)\.sony.*", "\1")), json_array("index")), "index")

You will also need a lookup in $SPLUNK_HOME/etc/system/lookups; in this example it's testlookup.csv. For the purposes of testing in SPL you can create a temporary lookup with this:

| makeresults
| eval hostType="jupiter-prd-cbc-us", index="index1"
| outputlookup testlookup.csv

Props/transforms:

== props.conf ==
[yourSourcetype]
TRANSFORMS-defineIndex = defineIndex

== transforms.conf ==
[defineIndex]
INGEST_EVAL = index=json_extract(lookup("testlookup.csv", json_object("hostType", replace(_raw, ".*v\-(?<hostType>[^\.]+)\.sony.*", "\1")), json_array("index")), "index")

For more info on how the lookup eval function works, have a look at https://docs.splunk.com/Documentation/Splunk/9.4.0/SearchReference/ConditionalFunctions#lookup.28.26lt.3Blookup_table.26gt.3B.2C.26lt.3Bjson_object.26gt.3B.2C.26lt.3Bjson_array.26gt.3B.29

Please let me know how you get on, and consider accepting this answer or adding karma to this answer if it has helped.

Regards
Will
Hi, I have data in two columns and am using a third column to display the matches:

| makeresults | eval GroupA = 353649273, GroupB=353648649
| append [ | makeresults | eval GroupA = 353649184, GroupB=353648566]
| append [ | makeresults | eval GroupA = 353649091, GroupB=353616829]
| append [ | makeresults | eval GroupA = 353649033, GroupB=353638941]
| append [ | makeresults | eval GroupA = 353648797]
| append [ | makeresults | eval GroupA = 353648680]
| append [ | makeresults | eval GroupA = 353648745]
| append [ | makeresults | eval GroupA = 353648730]
| append [ | makeresults | eval GroupA = 353638941]
| fields - _time
| foreach GroupA [eval match=if(GroupA=GroupB,GroupA ,NULL)]
| stats values(GroupA) values(GroupB) values(match)

However, nothing is getting displayed in values(match). Is there something wrong in the logic, or is there an alternate way to do it?
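One likely cause: the eval compares GroupA and GroupB on the same row, but the matching value (353638941) sits on different rows, so the row-wise comparison never succeeds. A sketch of an alternate approach, assuming you want any GroupA value that appears anywhere in GroupB: collect all GroupB values into a multivalue field with eventstats and test each GroupA against it.

| eventstats values(GroupB) as allB ``` allB holds every GroupB value on every row ```
| eval match=if(isnotnull(mvfind(allB, "^".GroupA."$")), GroupA, null()) ``` mvfind returns null when there is no match ```
| fields - allB
| stats values(GroupA) values(GroupB) values(match)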
I am looking to extract this section of an event and have it as a field that I can manipulate. I am unfamiliar with regex and I am getting the wrong results.

Events:

<28>1 2025-02-19T15:14:00.968210+00:00 aleoweul0169x falcon-sensor-bpf 1152 - - CrowdStrike(4): SSLSocket Disconnected from Cloud.
<30>1 2025-02-19T15:14:16.104202+00:00 aleoweul0169x falcon-sensor-bpf 1152 - - CrowdStrike(4): SSLSocket connected successfully to ts01-lanner-lion.cloudsink.net:443

I am looking to have a field called Disconnect based on "SSLSocket Disconnected from Cloud".
I want to extract a value from the following field while indexing the data, to use it to map the event to an index:

vs_name=v-jupiter-prd-cbc-us.sony-443-ipv6

I want to extract everything after v- and up to and including sony, i.e. jupiter-prd-cbc-us.sony, as fqdn, so that this fqdn can be checked against a lookup to map the event to the correct index. Please help me with props and transforms to extract fqdn correctly.
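For quickly testing the extraction pattern at search time before moving it into props/transforms, a sketch using the sample value from the post:

| makeresults
| eval vs_name="v-jupiter-prd-cbc-us.sony-443-ipv6"
| rex field=vs_name "v-(?<fqdn>[^\.]+\.sony)" ``` fqdn = jupiter-prd-cbc-us.sony ```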
The reason I want to revert is because of this known issue:

2024-12-03 PSAAS-20901 supervisord failing to start on warm standby instance
https://docs.splunk.com/Documentation/SOARonprem/6.3.1/ReleaseNotes/KnownIssues

When SOAR needs to be restarted on our warm standby, it fails to start because supervisord can't start. The only workaround I've been able to find is disabling the warm standby so it's a primary, then restarting SOAR, after which I set the server as the warm standby again.
Thanks for the reply. I understand that the error is due to there being no results, but that is exactly what I require: that it does not throw an error when there are no results, since when saving my correlation search it always throws an error and never completes a search. Is there any way to avoid this?
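One common SPL idiom that guarantees at least one result row is to append a placeholder when the base search comes back empty. A sketch, to be tacked onto the end of the correlation search (you may need to eval placeholder values for any fields the correlation search expects):

... your correlation search ...
| appendpipe [ stats count | where count=0 ]
``` when the base search has results, count>0 and nothing is appended; when it has none, a single count=0 row is appended ```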
Thank you for your reply. I need a completely different data source for the Table depending on the dropdown selection. If the value selected in the dropdown is "caddy", set the Table dataSource to "ds_EHYzbg0g"; if the value is "nginx", set the Table dataSource to "ds_8xyubP1c":

"ds_EHYzbg0g": {
  "type": "ds.search",
  "options": {
    "query": "host=\"$select_hosts$\" program=\"$select_program$\" priority=\"$select_log_leel$\" | fields host,program,sourceip"
  },
  "name": "logs_program_caddy"
}
Hi @livehybrid,

I've come to find out that monitoring the search itself is all I was able to find in the logs. I cannot seem to find a trace of an API sync or an API pull. I'm sure it exists, but I can't find anything in the _internal index related to it. Looking in there was also what was suggested by our technical representative.

I'll mark "monitor the sync" as the solution, as an alternative.

Thanks!
Hi @gcusello! Also interesting: the alerts in the index seem good, but the loading of the events in the Events dashboard never finishes.
Hi @kemeris

Are the programs you want to filter by in the data source? Or do you need to load a completely different data source depending on the dropdown selection? Assuming you want to apply a filter to the search based on the dropdown value, you would do something like this:

index=yourData platform=$platform$

Please let me know how you get on and consider accepting this answer or adding karma to this answer if it has helped.

Regards
Will
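If a single token-driven search can serve both programs, one data source can stand in for the two static ones. A sketch modelled on the ds_EHYzbg0g definition elsewhere in this thread ("ds_dynamic", the query, and the field list are assumptions; JSON has no comments, so all placeholder names are flagged here):

"ds_dynamic": {
  "type": "ds.search",
  "options": {
    "query": "host=\"$select_hosts$\" program=\"$select_program$\" | fields host, program, sourceip"
  },
  "name": "logs_program"
}

The Table's "dataSources": { "primary": ... } would then point at "ds_dynamic", and the dropdown's select_program token switches what it shows.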
Hi Giuseppe! Thank you for your answer. I double-checked, but my alerts are already global. I think there is another problem. Thanks, A
Hi @Andras, in the Alert Manager App you can only see alerts shared at Global level, so you have to change the permissions on your alerts from App level to Global level. Ciao. Giuseppe
Hello everyone! I installed Splunk and Alert Manager Enterprise in VirtualBox for learning purposes (4 CPU / 8 GB RAM). I configured AME via the documentation. Health Check is green. I can send test alerts, and they appear in the ame_default index.

However, the alerts don't appear in the Events view; it hangs forever. I have some broken pipe errors, but they also appear in another environment that works. Thank you for your help. A
Was able to get it working this way:

index=kafka-np sourcetype="KCON" connName="CCNGBU_*" ERROR!=INFO _raw=*
| eval error_msg = case(match(_raw, "Disconnected"), "disconnected", match(_raw, "restart failed"), "restart failed", match(_raw, "Failed to start connector"), "failed to start connector")
| search error_msg=*
| dedup connName
| table host connName error_msg ERROR
@Ciccius You need to configure a Data Input similar to how you would set up File Monitors, Performance Monitors, etc. Splunk needs to know what to read, where to read it from, how frequently to read it, where to index it, and what source/sourcetype to assign. You configure these in inputs.conf, either through Splunk Web or the CLI. Refer to the documentation: Get data from APIs and other remote data interfaces through scripted inputs - Splunk Documentation

Also read the Writing Reliable Scripts documentation, as most of the time scripted inputs have a wrapper script and need to maintain their own last-indexed position, recovery, and parallel-execution handling: https://docs.splunk.com/Documentation/Splunk/latest/AdvancedDev/ScriptSetup

Once you have completely tested your scripted input and made it robust for your scenario, you may be able to build an add-on using the Splunk Add-on Builder, or move towards creating your own Modular Input: https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/custominputs/
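For reference, a minimal scripted-input stanza in inputs.conf might look like this sketch (the script path, interval, index, and sourcetype are placeholder assumptions):

[script://$SPLUNK_HOME/etc/apps/your_app/bin/your_script.py]
# interval in seconds; a cron expression is also accepted
interval = 3600
index = your_index
sourcetype = your_sourcetype
disabled = 0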
I have a drop-down named "Program" and a Table with the static data source "ds_EHYzbg0g". How can I define the dataSource for the Table dynamically, based on the value of the "Program" drop-down?

{
  "options": {
    "items": [
      { "label": "All", "value": "*" }
    ],
    "defaultValue": "*",
    "token": "select_program"
  },
  "dataSources": { "primary": "ds_8xyubP1c" },
  "title": "Program",
  "type": "input.dropdown"
}
{
  "type": "splunk.table",
  "options": {
    "tableFormat": {
      "rowBackgroundColors": "> table | seriesByIndex(0) | pick(tableAltRowBackgroundColorsByTheme)"
    },
    "columnFormat": {
      "_raw": {
        "data": "> table | seriesByName(\"_raw\") | formatByType(_rawColumnFormatEditorConfig)"
      }
    },
    "count": 50
  },
  "dataSources": { "primary": "ds_EHYzbg0g" },
  "context": {
    "_rawColumnFormatEditorConfig": {
      "string": { "unitPosition": "after" }
    }
  },
  "showProgressBar": true,
  "containerOptions": {},
  "showLastUpdated": false
}
Hi all, I have configured a new script in 'Data inputs' to feed my index with data from a REST API. The script is written in Python 3; it makes a simple request to the endpoint, gathers the data, does a little manipulation of it, and writes it to stdout with the print() function. The script is placed in the 'bin' folder of my app, and using the web UI I configured it without any issue to run every hour. I tested it by running it manually from the command line, and the output is what I expected. In splunkd.log I have the trace that Splunk ran it, as follows:

02-19-2025 10:49:00.001 +0100 INFO ExecProcessor [3193396 ExecProcessor] - setting reschedule_ms=86399999, for command=/opt/splunk/bin/python3.7 /opt/splunk/etc/apps/adsmart_summary/bin/getCampaignData.py

... and nothing more is logged, neither errors nor anything else. But in the index I chose in the web UI there is no data coming from the script. Where can I start to check what is going on? Thanks!
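Two hedged starting points for troubleshooting (search sketches; the script name comes from the log line above, and the index name is a placeholder). First, check what ExecProcessor logged around the script, since any stderr output from a scripted input lands there:

index=_internal sourcetype=splunkd component=ExecProcessor getCampaignData.py

Then confirm whether anything arrived at all, searching over All Time in case of timestamp problems:

| tstats count where index=your_index by sourcetype

Incidentally, reschedule_ms=86399999 in the log is roughly 24 hours, not one hour, so it may also be worth double-checking the configured interval.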
Hey, we have come across the same requirement, to duplicate a Grafana dashboard in Splunk Observability. Currently we have our K8s dashboards in Grafana, but now we need to replicate them in Splunk Observability Cloud. How can this be done? Thanks
#splunkcloud #grafana